How to Use Cloudflare Firewall Rules to Protect Your WordPress Website

In this article, we are going to explain how you can protect your WordPress website, or any website for that matter, using Cloudflare Firewall Rules.

What are Cloudflare Firewall Rules?

Cloudflare Firewall Rules is a powerful and flexible security feature offered by Cloudflare that lets you filter your website traffic.

Cloudflare Firewall Rules are available on all Cloudflare plans; the free plan can have up to 5 active Firewall Rules.

A Firewall Rule is made up of two parts:

  • Matching: a defined filter that runs against your traffic and matches it on a string or pattern
  • Action: the action performed on the matched traffic (block, challenge, captcha, allow)

You can also reorder firewall rules to override the default sequence, which is based on each rule’s action.

Firewall Rules: Matching

Firstly, we have ‘Matching’. This lets you match traffic against properties of the HTTP request, including country, hostname, IP address, URI, referrer, known bots, threat score and various other options.

Known bots (cf.client.bot) is a Cloudflare-defined list of known good bots, which includes bots from Google, Apple, Bing, LinkedIn, Pingdom, and Yahoo. We recommend adding cf.client.bot to an Allow rule so you do not block good crawlers, which could affect your SEO and monitoring.
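
As a minimal sketch, an Allow rule for good crawlers can use an expression as simple as the one below, with the action set to Allow:

(cf.client.bot)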

Cloudflare also has its own algorithm to calculate an IP address’s reputation and assigns it a threat score ranging from 0 to 100. This score is used by the Security Level setting under the Firewall, which works as follows:

  • High – for scores greater than 0
  • Medium – for scores greater than 14
  • Low – for scores greater than 24
  • Essentially Off – for scores greater than 49

Regular Expression matching is supported for Cloudflare Business and Enterprise plans.

Firewall Rules: Action

With this, you can set the action to perform on the matched traffic.

  • Block: the traffic is blocked from reaching your web application.
  • JS Challenge: a JavaScript challenge. Visitors without JavaScript support (mostly bots) will be blocked.
  • Challenge (Captcha): the visitor is required to pass a captcha challenge before being allowed access.
  • Allow: the traffic is allowed to reach your web application.

Accessing Cloudflare’s Firewall Rules

Through the dashboard, you can set up all your desired rules. To get there:

  • Log in to your Cloudflare dashboard
  • Select the domain you want to configure Firewall Rules for
  • Click Firewall from the tools at the top
  • Click Firewall Rules

From this screen you can:

  • Create a new Firewall Rule
  • Search and filter the list of existing rules
  • See a list of existing rules (active and paused)
  • Activate or pause rules (turn them on or off)
  • Edit a rule
  • Delete a rule

Here are some firewall rules you can use to protect your website

Block certain Countries from visiting your website

Expression Editor:


(ip.geoip.country eq "RU") or (ip.geoip.country eq "HK")

In our example, we are going to block Russia and Hong Kong. You can add as many countries as you wish by clicking the ‘Or’ button to add additional conditions. You could also block all countries except the United States, in which case you would change the operator to ‘does not equal’ and set United States as the value.
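
For example, a sketch of an expression matching every country except the United States (to which you would then apply your chosen Block or Challenge action) could look like this:

(ip.geoip.country ne "US")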

WordPress Security

Expression Editor:


((http.request.uri.path contains "/xmlrpc.php") or (http.request.uri.path contains "/wp-login.php") or (http.request.uri.path contains "/wp-admin/" and not http.request.uri.path contains "/wp-admin/admin-ajax.php" and not http.request.uri.path contains "/wp-admin/theme-editor.php"))

This Firewall Rule will challenge all visitors that try to access WordPress’ xmlrpc.php, wp-login.php, and /wp-admin/ (except admin-ajax.php and theme-editor.php). A simple rule like this can block most hack attempts against your WordPress website.

Block bad spam bots using Expression Editor

Expression Editor:


((http.user_agent contains "crawl") or (http.user_agent contains "bot") or (http.user_agent contains "spider")) and not cf.client.bot

The expression above is a condensed example rather than the full list: it blocks any traffic whose user agent contains strings such as ‘crawl’, ‘bot’, or ‘spider’, while excluding Cloudflare’s known good bots (cf.client.bot). You can extend it into a much longer list by adding further user agent strings with additional ‘or’ clauses.

Here we add the Firewall rule using the Expression Editor as shown above.

To do this, follow these steps:

  • Click Create a Firewall Rule
  • Give the rule a name
  • Click Edit expression
  • Copy and paste the expression into the text area
  • Select the Block action
  • Click Deploy to activate the Firewall Rule

Block and Challenge users with a certain Threat Score

We mentioned the threat score earlier. What we can do here is add a rule to challenge users with a threat score above a certain value (let’s say 10), and then block users with a threat score above 20, for example.

To do this, we need to create two different rules, as shown below. Firstly, we will create the rule to challenge users with a threat score greater than or equal to 10:

Expression Editor:


(cf.threat_score ge 10)

Secondly, we will block users with a threat score of greater than or equal to 20:


(cf.threat_score ge 20)

Checking your Cloudflare Firewall Rules

You can check your Firewall Rules by going to the Firewall Event Log (Firewall > Events), which will list the firewall events (allow, challenge, block) and their details.

Take note of the challenged and blocked events. You do not want to mistakenly block good traffic because of a wrongly configured Firewall Rule.

Conclusion

From this, we have learned what a Cloudflare Firewall Rule is and how to configure it to filter traffic and protect your website. We have also gone through how the Expression Editor works for writing more complex firewall rules.

We sure hope you have found this tutorial useful. If you have any further suggestions for Firewall Rules, do let us know. If you need any assistance with this then get in touch by dropping a comment below.


Posted by: Nathan da Silva

Nathan is the Founder of Silva Web Designs. He is passionate about web development, website design and basically anything digital related. His main expertise is with WordPress, Magento, Shopify, as well as many other frameworks. Whether you need responsive design, SEO, speed optimisation or anything else in the world of digital, then get in touch. If you would like to work with Nathan, simply drop him an email at [email protected]


Should you choose a freelance team over a digital agency?

If you are a project manager, or you are responsible for finding talent for the company that you work for, you may be tempted to turn to large digital agencies for your project. Sometimes, though, it makes more sense to choose a freelance team over a digital agency.

Looking to a large agency is a perfectly natural response and it does make a certain amount of sense: established reputation, larger teams to work on your project, etc.

The trouble is that the two reliability signals mentioned above are largely perception based. Reviews can be faked, for example, leading to false signals in terms of reliability and professionalism. When choosing a freelance team over a digital agency, you are more likely to head for a freelance network such as PeoplePerHour.

Reputation signals here, such as reviews and ranking, are near impossible to spoof which makes them much more reliable and trustworthy. If the numbers say a freelance team is of very high quality, you can take that to the bank.

Of course, an established reputation is not the only consideration.

Cost versus quality when you choose a freelance team over a digital agency

Contrary to popular belief, the cost does not always reflect quality. For professional freelance teams, the quality is always going to be on a par with their much more expensive digital agency counterparts. In fact, very often, the quality coming out of a freelance studio is even better.

It may well seem like the opposite would be true, but let’s take a closer look at this. Agencies have lots of staff, in terms of the creatives and the behind-the-scenes staff that keep everything else ticking over.

All of this is very expensive, as you can imagine, and this isn’t taking other running costs into consideration either. Things like office space for all those people, equipment, IT departments, exorbitant advertising costs to stay ahead of other agencies… it’s a very expensive business.

This, of course, is reflected in the cost to the purchaser, the client. But in order to keep even those rates down so they actually sell services in the first place, they have to do something else too.

Digital agencies are under immense pressure to make sales, keep the revenue coming in and keep the business going. To this end, each creative department is under a massive workload weight. With strict output targets to meet, so they serve as many clients as possible, quality is obviously going to suffer.

In the instances where quality is maintained, there are going to be other areas that are lacking: delivery times, customer support, query responses, time taken for edits to be performed… everything is affected when companies put staff under undue pressure.

Why choose a freelance team over a digital agency? What’s the difference? Surely freelance teams are under similar pressure to stay afloat? Not so. Let me explain.

Freelance teams are often the better option

A bold claim, right? Maybe, but it’s entirely justified. Depending on your actual needs, you may be much better off if you choose a freelance team over a digital agency. Here are just a few reasons why that statement is true.

Freelance teams are cost-effective

Because freelance teams have much lower overheads (they rarely have multi-user studios, with members of their team often working remotely) there is much less pressure on them to stack clients like an air traffic controller in order to pay their bills. In fact, a freelance team can take fewer clients at one time and still meet financial requirements.

What does all of this add up to? The quality of work that is produced remains the same, and very often it is actually increased as pressure is reduced, but the cost of the creative work is brought down and sometimes dramatically.

Cost does not equal quality, and freelance teams have been proving exactly that for years.

You know exactly who is working on your project

When deciding whether you should choose a freelance team over a digital agency, you need to decide how much you value accountability. Sure, the agency as a whole takes ultimate responsibility but are you entirely comfortable not knowing who did what, why and when?

Agencies are pretty well known for their inherent lack of transparency, which is not something many companies are comfortable with.

With a freelance team of creatives, you always know who is working on what, as teams are usually comprised of people with very specific skillsets and the teams are, by their very nature, quite small. Many freelance teams will also brief you on who will be working on which part of the project for you.

All of this helps to instil a level of trust that just isn’t possible with a digital agency.

Freelance teams are heavily invested in your success

A freelance team wants your business to succeed… Scratch that, they need your business to succeed. If their own business is to survive, they need to be sure that they do the very best possible job for you. A customer that is dissatisfied is very likely a customer that they will never see again. Worse, they could actively damage the team’s overall reputation.

Freelance teams are by nature heavily invested in the success of their customer’s business, particularly where the work they have done has a direct influence on their success or failure. It’s because of this, above all other reasons, that the question “should you choose a freelance team over a digital agency” stops being a question.

A digital agency, unless they are totally useless, will survive a customer’s failed project or business. A freelance team has much more to lose, so you can be sure they are going to put every effort into making sure your project is everything that it can be.

Digital agencies are not usually as responsive as freelance teams

Digital agencies typically only operate between the hours of 9 am and 5 pm, sometimes a little later but not usually. Outside of these hours, they are not normally contactable, so if you need to get in touch with them regarding your project you will almost always have to wait until the next business day.

Many freelance teams are comprised of creatives in several different countries. Silva Web Designs, for example, has freelancers in the United Kingdom, the United States, Australia and Portugal. What these varied time zones mean for our clients is that we are contactable 24 hours a day, 7 days a week.

Ask yourself “how quickly do I need an answer to something that is time-critical?”.

The answer is pretty obvious. If you are responsible for deciding whether or not your company should choose a freelance team over a digital agency, then you should know that freelancers are much more responsive. Their business depends on it, and if yours does too, then you already have the answer to the overall question… you choose the freelance team.


The 8 Best Cloudflare Page Rules For Any WordPress Site

If you are looking to add the best Cloudflare page rules for your WordPress website, then you are in the right place!

What these page rules will do are:

  • Save Bandwidth
  • Improve Security
  • Bypass WordPress Admin Caching
  • Prevent Spam Bots Collecting Email Addresses
  • and much more!

Do note, however, that Cloudflare free accounts only give you three page rules; we will list the priority ones first in this tutorial.

As well as Page Rules though, don’t forget to configure the other settings in your Cloudflare dashboard and to use Firewall Rules to block bots from hitting your site excessively and consuming resources.

Rule 1. Secure the WordPress Admin and Bypass Cache

For your WordPress Admin Dashboard, there are a few settings which we can combine into a single page rule. What we will do here is set the security level to High and bypass Cloudflare’s cache (as there is no need to cache the admin area). We should also disable Cloudflare Apps and performance features (such as minify, Rocket Loader, Mirage, Polish, etc.). We only want to speed these things up on the frontend, which is why we are disabling them in the admin backend.

So, for your page URL, you should use this:


yourwebsite.com/wp-admin*

Your page rules will end up looking like this:
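
In text form, the settings for this rule would be roughly as follows (the exact labels in the Page Rules screen may differ slightly):

Security Level: High
Cache Level: Bypass
Disable Apps: On
Disable Performance: On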

2. Decrease Bandwidth Of WP Uploads

WordPress upload files do not change very often, so there is no need to re-fetch them from the origin as frequently; caching them for longer saves a lot of bandwidth. We can achieve this by setting Edge Cache TTL to a month. If you need to update certain files or directories before the month is up, you can always purge the cache for individual files within Cloudflare.

We are also going to set the Browser Cache TTL to a day. This sets the expiration time for resources cached in a visitor’s browser, an item often flagged in GTmetrix.

So with these rules, your page URL would become:


yourwebsite.com/wp-content/uploads*

With your page rules looking something like this:
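
Roughly, the settings for this rule would be (the exact wording of the dropdowns may differ slightly):

Edge Cache TTL: a month
Browser Cache TTL: a day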

3. Stop Bots From Collecting Your Email

What this page rule will do is hide your email address from bots (so they don’t use it to spam you). The email address will still be fully visible to human visitors on your website. The general approach here is to enable email obfuscation on any page that contains your email address, which will, in turn, help prevent spam. Alternatively, you can turn it on globally in Cloudflare’s Scrape Shield settings rather than on individual pages.

Let’s say you only have a visible email address on the contact page, then we can simply add this page rule URL:


yourwebsite.com/contact

And your page rule settings would look as follows:
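
In text form, the rule would simply set:

Email Obfuscation: On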

4. Don’t Cache Preview Pages

This will simply bypass Cloudflare’s cache for page and post previews. This helps especially when updating a live website; you don’t want to see a cached version of a preview page while performing updates, right?

Page URL:

yourwebsite.com/*preview=true*

And your page rule settings would look as follows:
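
Put simply, the rule would set:

Cache Level: Bypass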

5. Forward XMLRPC URLs

What this page rule will do is significantly improve security against hackers who use XML-RPC for their attacks. It forwards requests for your xmlrpc.php file to any URL on your site, e.g. your homepage.

Your Page URL will become:


yourwebsite.com/xmlrpc.php*

And your page rule settings will look as follows:
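
Roughly, the rule would be configured with a forwarding setting like this (the destination URL here is just an example; use whichever page you prefer):

Forwarding URL: 301 - Permanent Redirect to https://yourwebsite.com/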

6. Make Important Pages Always Online

As the name suggests, Always Online keeps pages online if your server goes down, and it can be turned on for the most important pages of your website (for example, your homepage, contact page, portfolio page and so on). So if anything was to happen to your WordPress website, your most important pages will remain visible.

To do this, set your Page URL to:


yourwebsite.com/url-of-important-page

Then your page rules will look like this:
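
As a sketch, the rule would simply set:

Always Online: On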

7. eCommerce Sites And Dynamic Content Using AJAX

eCommerce websites include dynamic content (which shouldn't be cached) but you still want to cache everything else. A good solution is to cache the entire page, but bypass the cache for dynamic (eCommerce) elements like AJAX requests. Achieving this requires two separate page rules.

The first page rule bypasses the cache for AJAX requests:


yourwebsite.com/ajax*

This will result in something as per the below:
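
Roughly, this first rule would set:

Cache Level: Bypass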

The second rule we will be adding caches everything else. When ordering page rules, make sure the AJAX rule is before the Cache Everything rule. In other words, this page rule should be ordered last.


yourwebsite.com/*

Which will result in the below:
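
As a sketch, this second rule would set:

Cache Level: Cache Everything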

8. A Rule to Force HTTPS connections

This forces all visitors to connect to your website through HTTPS. This means that all visits through HTTP will redirect to the HTTPS version.

This can be added as follows:


http://*yourwebsite.com/*

The page rules will look as follows:
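
In text form, the rule would simply be:

Always Use HTTPS: On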

However, since Cloudflare already has a dedicated option for this, you can simply enable it in your dashboard under SSL/TLS → Edge Certificates → Always Use HTTPS. This saves you from having to use one of your three page rules, which is why we mentioned this one last.

Conclusion

So there you have it, you now know which Cloudflare page rules to implement on your WordPress website. In the beginning, we said that the first three page rules were the most important. However, this depends on the type of website you have; not every site is going to have the same page rule settings, which becomes quite evident when comparing a standalone WordPress blog with an eCommerce website.

This should give you a general idea of what you should be adding and how your website can be optimised with Cloudflare. If you've not used Cloudflare and want to know the benefits it can provide to your website, we would recommend reading this post: 4 Reasons to Use a CDN for WordPress

Remember though, in this tutorial we have only gone through the Page Rules you can use to optimise your WordPress website; there are other tweaks, which we list below:

Additional Cloudflare Tweaks To Improve WordPress Speed

Rocket Loader is a great addition to improve page speed. However, if you are using the WP Rocket plugin, then it might not be beneficial to use this setting. We would test this with GTmetrix and compare the statistics with both options (enabled/disabled).

If you have upgraded to Railgun, then this makes sure requests that cannot be served from Cloudflare's cache are still fast.

Hotlink Protection prevents people from copying/pasting images from your website to theirs (possibly resulting in bandwidth savings). Especially helpful for sites using high quality images or people who want to protect the images on their website.

What about if I'm using WP Rocket? What should I do then?

If you are using WP Rocket's amazing caching plugin, then you can add your Cloudflare credentials within the settings:

  • Global API key is found in your Cloudflare profile
  • Account email should be same email used in Cloudflare
  • Zone ID is found on the 'Overview' tab of your dashboard

Optimal Settings allows WP Rocket to configure your Cloudflare settings for better compatibility with their plugin. However, it also turns on email obfuscation (resulting in a GTmetrix error on every page) and disables Rocket Loader which may be useful for your site.

Fortunately, WP Rocket has recommendations for configuring Cloudflare such as:

  • Set Caching Level to 'Standard'.
  • Enable Auto Minify for JavaScript, CSS and HTML.
  • Disable Rocket Loader to prevent conflicts.
  • Set Browser Cache Expiration to '1 year'.

What do these Page Rules Terms mean?

  • Always Online - This keeps a limited version of your site online if your server goes down for any reason. It is usually used for your most important pages (e.g. homepage, shop, contact page, etc.).
  • Browser Integrity Check - This denies spammers access to your website and challenges visitors with a suspicious user agent commonly used by abusive bots.
  • Browser Cache TTL - This is how long Cloudflare instructs a visitor's browser to cache a resource. You can increase it for pages that aren't updated frequently to save on bandwidth.
  • Disable Performance - This turns off Auto Minify, Rocket Loader, Mirage, and Polish. These are great for speeding up pages, but they should be disabled for your WordPress Admin area.
  • Edge Cache TTL - This is how long Cloudflare's edge servers cache a resource before going back to the origin server for a fresh copy. You can also increase it for pages not updated frequently.
  • Email Obfuscation - This prevents spam by hiding your email address from bots while keeping it visible to visitors. You would only use this if your email address is publicly displayed on your website; enabling it on the contact page (and other pages showing your email) can help prevent spam.
  • Security Level - Cloudflare assigns IP addresses a threat score of 0-100, and the security level determines which scores are challenged. Page rules can be created to assign a high security level to the WordPress admin and other sensitive areas of your site.
  • Cache Level - The amount of caching done by Cloudflare ('Cache Everything' is the most aggressive option).
  • Asterisk (*) - This is used in page rule URLs as a wildcard to match certain patterns. For example, if I used silvawebdesigns.com/wp-admin* as my URL and set the security level to High, then all URLs containing /wp-admin/ would have a high security level.

Do you have any questions?

Here we answer some of the most commonly asked questions regarding the setup of Cloudflare.

What do asterisks do in page rules?
Asterisks serve as a wild card when using a URL in the page rule. For example, yourwebsite.com* would include any URL variation that comes after the asterisk. If you use *yourwebsite.com* as an example, this would include anything before or after, in this scenario, it would also include sub-domains.

What is best Page Rule for the WP Admin?
The WordPress Admin should have a page rule that enforces a high security level, bypasses Cloudflare's cache, and disables apps and performance features in the admin area. Since WordPress is such a common target precisely because it is so widely used, this should be one of the main priorities for your website.

How can page rules improve speed?
Page Rules can help decrease the bandwidth used by the WP uploads area, set a higher Edge Cache TTL, and cache content aggressively with the right rules. On the other hand, if you are looking simply to improve your page speed results (i.e. GTmetrix), configuring Cloudflare's Speed tab in the dashboard is the way forward.

How can page rules improve security?
Page Rules can force SSL, forward XML-RPC requests, and let you use email obfuscation (to prevent spam bots from collecting your email) on single pages without having to worry about an email-decode error showing up in GTmetrix for your entire site.

How many page rules can I have?
You can add up to three page rules on Cloudflare's free plan; after that, it costs $5/month for 5 more rules. You can find out more about Cloudflare's pricing if you do wish to upgrade.


And that finally wraps everything up! If you have any questions about these page rules then don't hesitate to get in touch, we'd love to help you. If you have any better implementations, then we are all ears, let us know.

Drop us a comment below if this has helped and as always; thanks for reading! 🙂


5 Ways How to Increase Organic Traffic Without Link Building

Link building is one of the most popular ways web owners use to rank their website on the SERPs. It’s also one of the fastest ways to bring Google’s penalty hammer down on your website.

Well, if link building is an effective organic SEO strategy, why ban people from doing it? And if it’s frowned upon by Google and other search engines, why do site owners still risk it? The answer to this has been shrouded in shades of grey for a long time.

Firstly, the problem with backlinking isn’t the act of linking itself; it’s the intent behind it. Websites that have a few links to their web pages from authoritative websites have a good thing going. However, too many web owners take it too far and try to drown their websites in links from unreputable sites, most of them paid for.

Of course, we get the drive; everyone wants to earn wealth and live an improved lifestyle. However, using shady backlinking practices to boost your site’s traffic and, potentially, revenue, is almost guaranteed to end in tears.

Buying backlinks is a massive risk, and it’s only a matter of time before Google’s algorithm fishes out your site. And you can be sure they will, and when they do, there’s no coming back. The introduction of ‘no-follow’ links was a response to spam links; these link types tell websites not to pass on their reputation or vote of confidence to the sites linking to them. Still, no-follow and do-follow links can be useful if you know how to use them, but that’s not the reason for this post.

The truth is that links are still a huge ranking factor for websites even though search engines see most of it as an SEO guideline violation. Thankfully, there are a ton of other ways to boost organic traffic to your website.

You could be building a POD platform like Teespring, an authority blog like Neil Patel, an essay writing service review outfit like Online Writers Rating, etc. Whatever the case, if you’re thinking of how to increase organic traffic, the strategies on this list will prove incredibly useful for you if you do them right.

Focus on researching and putting out viral content.

There’s regular content, and there’s viral content. This doesn’t mean that regular content is terrible by any means; however, viral content, by definition, is the kind of content that “breaks the internet” and pulls waves of traffic to your website.

While almost anyone can write and publish regular blog posts, creating viral content is an art form. It takes a higher level of writing and research skill to put out such content successfully. Thankfully, there are a few reliable resources online to help with that.

Authority resources like Buzzfeed, Upworthy, and many more have a ton of posts to guide you in writing articles that will go viral. Virality, beyond the writing and research, has to also do with having your finger on the pulse of trends, as they are more likely to go viral.

Another resource for gleaning viral trends is Buzzsumo. It is an established idea generator for insight into trends and trend-worthy article ideas. YouTube is also another great idea generator. You will come across many videos that inspire you about stuff that hasn’t been written about yet, or at least not exhaustively. Develop those and watch your organic traffic grow.

Long-tail keywords are your friend.

Can you guess how many people are trying to get traffic with “digital marketing,” “hair care,” “credit cards,” and more? Probably a lot more than you think. There’s just one problem; they’re all doing the wrong thing.

These keywords are generic and very likely oversaturated. Plus, they have no intent behind them anyway. However, imagine instead of trying to rank for “what are credit cards?” you build content around “what are the best no-fee credit cards in Ontario?” Do you see the difference? The latter keyword is a long-tail example and is something people are searching for. Everyone already knows what a credit card is, so no one is searching for the first keyword—unless they’re from another planet, which is just as unlikely.

Long-tail keywords typically have sizeable search volumes that grow with time, so it makes sense to use them to boost organic traffic. Just bear in mind, though, that ranking for long-tail keywords is almost always a long-term tactic. You’re almost guaranteed to see no impact on your traffic in the first few months. But if you’re patient, the results typically start to come in at the six-month mark and beyond, and it can affect your website traffic rankings significantly.

Don’t search for keywords. Search for questions.

Technically, they’re the same thing. However, when most people make a search engine query, they usually phrase it as a question. A quick look at Google’s autocomplete feature will prove this assertion. What’s more? These are usually long-tail keywords, which means the competition is relatively low and the traffic potential relatively high.

So how do you get traffic to your website using these questions? You’d have to find them first, and besides Google’s autocomplete, places like Quora are a good place to start your search. A quick tip: if you find a Wikipedia page as a top result, the chances are that you may have a shot at the keyword in question. Wikipedia typically doesn’t employ too many backlinks, so showing up as a top result points to the fact that you may be able to rank for the same keyword without risky backlinking practices.

Tools like Moz are also great at sniffing out a page’s domain and page authority. You can use it to analyse the top five results to understand your chances for ranking for any given keyword better.

Niche topics are amazing.

There was a time before Google Panda when crash and burn website experts were raking in the big bucks. All they had to do was create a website and stuff it full of hundreds of pages talking about the exact same niche topic. Thankfully, those days are gone, and Google’s algorithm has prioritised content quality.

Still, this doesn’t negate the usefulness of niche topics. There are so many untapped or undersaturated niches out there that are just waiting to be discovered and optimised. Niche topic sites include gun blogs, health blogs, writing recommendation blogs like Best Writers Online, gaming blogs, gardening blogs, and more. Almost anything could be a niche topic.

So look out for niches that have relatively little competition, and you should be able to create content that will rank for said niche easily.

Look out for markets beyond your geographic location.

Don’t forget; the world is a global village. Just because you live in the United States does not mean you should only target US traffic. Consider creating content for a more global audience. Content marketing is still relatively unpopular in many parts of the world, keyword saturation is still a myth, and many of the practices from before Google Panda still work in these places.

The beautiful thing is that content for these regions can blow up nicely and boost your organic traffic without you needing to build a single backlink.

Final Words

Hopefully, you can see now that getting traffic to your website isn’t always about link building. The strategies in this post are a wonderful way to get a superior rankings boost that will, in turn, grow your organic traffic, risk-free.


Posted by: Melissa Mauro

Melissa Mauro is a self-improvement author who is always interested in new projects. She wants to create her own writer brand, that’s why Melissa is looking for fresh platforms for the implementation of her ideas. Creativity and unique style make it possible to deliver valuable and engaging content to her ideal reader.


What To Put On Your Instagram Bio To Achieve Your Conversion Goals

If you’ve ever wondered what to put on your Insta bio, you aren’t alone. Many brands struggle with this question as they create their Instagram marketing strategy. They know their Instagram bio is important — but they aren’t quite sure how to optimise it to get conversions.

The good news: When you know what you need to include on your IG bio, it’s not difficult to create one that’s stellar. Keep reading to learn exactly what you need to include in your company’s Instagram bio so you can attract the right kind of followers and turn them into loyal, committed customers.

Bio Elements To Help Achieve Your Conversion Goals

Before you can decide what to put on your Insta bio, you need to make sure you have a business Instagram account. This type of account has several extra features that are helpful to brands marketing themselves on Instagram — you can promote posts, track how they’re performing, and more.

When your business account is set up, you can move on to creating your Instagram bio. You might be tempted to type out your slogan and leave it at that. But there’s a lot more that goes into a good Instagram bio than you might think. From a unique selling point to your location to a call to action, it’s essential to think carefully about every aspect of your bio so you can get as many conversions as possible. Here’s what you need to put on your Instagram bio to achieve your conversion goals.

Unique Selling Point

A unique selling point helps people understand what’s so special about your business. What does your company offer that other companies don’t? That’s the question a unique selling point aims to answer. USPs have been around since advertising campaigns in the 1940s, and they’re the best way to differentiate your business and help you stand out in the marketplace.

Your USP is an integral part of your branding. This unique selling point isn’t just a slogan — it’s something that’s woven into the very fabric of your business. Your USP meets a specific need that your customers have. This means that before you can define your USP, you need to understand the needs that your ideal customer has.

In order to create a strong USP, you also need to make your product or service sound attractive. Your USP will help you achieve conversions by making your product sound irresistible to the people who need it most.

When you have your USP in place, it will help define your marketing strategy and efforts, including the marketing work you do on social media platforms. Your unique selling point isn’t meant to be a secret. You need to make your USP as widely visible as possible — and one great way to do this is by putting it in your Instagram bio.

Let’s take a look at a few examples, starting with So Delicious Dairy Free. There are a lot of ice cream brands out there, but So Delicious immediately explains what sets this brand apart: Its products are dairy-free, thoughtfully crafted, and aim to nourish your soul, not just your body.

NYX Professional Makeup also includes its value proposition in its Instagram bio. The cosmetics brand explains that it’s different from other makeup brands because it’s committed to being cruelty-free and inclusive.

Finally, another brand that does a great job including its USP on Instagram is Petco. This pet store tells you two things — it’s obsessed with pets and is committed to helping you care for your pets by not selling any food or treats that include artificial ingredients. For an audience of pet owners who want to give their pets the best care possible, this is a pretty convincing USP.

Location

Don’t forget to include your location in your Instagram bio! This is especially important for brands that have physical stores. You want potential customers to know where you’re located so they can stop by. And even if you don’t have a physical store, you can still add your location. Service-area businesses (where employees travel to customers to provide them with services) can still benefit from having a general location in their bio, even if you need to be vaguer (‘Chicago’) versus listing a specific street address.

How do you add your location in your Instagram bio? Edit your profile and choose ‘Contact options’. Then you can add the street address, city, and zip/post code of your business so your followers can see where you are on a map.

Even if you don’t have a physical address to share, you can still indicate your address in your bio. For example, Ponce City Market makes sure that its customers will know where to go. This company includes its location in its bio two additional times beyond the official address.

Here’s another example. La Monarca Bakery has 12 locations in the Los Angeles area. So instead of including just one address, the brand explains in its bio that it has 12 locations open in L.A. Then customers can find the one nearest to them by visiting the bakery’s website or shooting them a quick message.

Hey Groomer Mobile Pet Service is an example of a service-based business that includes its location in its Instagram bio. Since this mobile pet grooming service doesn’t have a storefront, it simply includes ‘Miami, Florida’ in its bio so that people in that location know to reach out.

Including your location in your Instagram bio is a smart way to get more conversions. Potential customers will have no doubt about where you are and where you will go.

Call To Action

When it comes to what to put on your Insta bio, a call to action (or CTA) is extremely important and shouldn’t be overlooked. CTAs in your Instagram bio typically appear as a short phrase or sentence asking readers to do something (most commonly, to tap the link in your bio). If your Instagram bio has all of the other elements to be successful but doesn’t have a CTA, you’re missing out on a valuable opportunity to drive website traffic and get more sales.

What does this CTA look like? The Parks Project includes a line in its bio that says ‘Tap here to shop our Sunrise Collection’, encouraging followers to click the link and browse products.

Pottery Barn Kids tells profile visitors to click the link to shop its feed, with a handy arrow pointing toward the place to click.

Your CTA doesn’t always have to be explicit. If you’re running out of space in your bio, try using a simple emoji to do the trick. Pressed Juicery indirectly asks Instagram users to click the link in its bio by using an emoji that points toward the link.

You can also think about your CTA more broadly. You don’t have to ask people to click the link in your bio — if there’s something else you’d like them to do, include a CTA for that. For example, Dermae asks users to tag #dermae for a chance to be featured on its account.

When it comes to what kind of CTA to include in your bio, the sky’s the limit. This is an easy way to get more conversions from your Instagram marketing efforts.

Link

The final step to getting more conversions through Instagram is to include a link in your bio. Since your website is most likely where the conversions happen, you need to give your customers a way to get there.

Add a link to your website homepage, or if you have a specific product you’re promoting, to a product page or your online shop. You can also use a link in bio tool to promote multiple pages at once. Using a link in bio tool is a smart way to get around Instagram’s one-link-per-bio rule. With these tools, you can create a customised landing page that offers a list of links for your followers to choose from. You can choose which links to put at the top of your list and which links to prioritise by making that CTA button move to catch the eye.

Some tools alternatively give you the option to create a clickable Instagram grid so your followers can shop from your feed. Food brand Cappello’s uses this type of tool, allowing Instagram users to shop each individual photo.

Link in bio tools are a smart way to get more conversions — you can promote multiple sales at the same time and give your customers access to every link they need.

Optimise Your Instagram Bio

Now that you know what to put on your Insta bio, it’s time to create your own. Optimising your bio is one of the best ways to make a good impression on potential customers and encourage them to engage with your brand. Use these valuable tips to create an Instagram bio that will help you achieve your conversion goals!


Install SSL Certificate on Apache via SSH

SSL stands for Secure Sockets Layer. It is used to secure the connection between internet browsers and web servers or websites by transferring encrypted data rather than plain text. You can secure HTTP connections by installing an SSL certificate, which will allow https:// connections instead of the standard http://. There are two types of certificates:

  • SSL certificate issued by the Certificate Authority (CA)
  • Self-Signed SSL certificate.

The main difference between these two types is that for a Self-Signed certificate, no third party verifies the identity information of the website, and hence it is not trusted by any of the web browsers. So, accessing a website with a self-signed certificate will prompt an Untrusted Connection warning and you’ll have to confirm the security exception manually. This is something users wouldn’t like to do. This is where SSL certificates verified by a CA come into play. The CA verifies the website’s identity information and also provides a CA Bundle (for browser compatibility), so these connections are accepted by almost all browsers.

To install an SSL certificate (of either type), we first need to generate a Private Key and a CSR (Certificate Signing Request).

1) Generate Private Key On The Server

OpenSSL is the open-source SSL toolkit that comes with most Linux distros. Make sure the openssl package is installed.

We generate the private key with the openssl command shown below.


openssl genrsa -des3 -out www.domain.com.key 2048

This will prompt for a passphrase. When you enter the passphrase and hit Enter, the key file will be generated in the present working directory with the file name ‘www.domain.com.key’, where domain.com is the domain name you used when generating the key.

2) Generate Certificate Signing Request (CSR)

After generating your private key, you need to generate a CSR (Certificate Signing Request). You can easily create that with openssl command.


openssl req -new -key www.domain.com.key -out www.domain.com.csr

A few questions regarding the website’s identity will be asked; these details are what the certificate authority checks.

The CSR will be generated in the present working directory with the file name ‘www.domain.com.csr’.

3) Create SSL Certificate

After generating the Private Key and CSR, you need to create the SSL certificate. This is where the difference between the two types comes into play.

For a CA-verified certificate, you provide the CSR to the certificate vendor (your private key stays on your server). They will provide a CA-verified certificate file (.crt file) which you can then install. For a Self-Signed certificate, you need to generate the certificate manually.

Generating Self-Signed certificate
The certificate file is generated from the CSR and signed with your private key; the identity information from the CSR is encoded into the .crt file. The command is given below.


openssl x509 -req -days 365 -in www.domain.com.csr -signkey www.domain.com.key -out www.domain.com.crt

The certificate file will be generated in the present working directory as ‘www.domain.com.crt’. Please note that domain.com is the domain name in this example and it should be replaced with your actual domain name.

To install this certificate for a website, you need to create a new VirtualHost for the domain name, because SSL uses a different port from the common port 80: the SSL port is 443. Apache will then listen on both 80 and 443 for non-encrypted and encrypted traffic respectively. Alternatively, you can create a separate conf file in the /etc/httpd/conf.d directory and ask Apache to include that directory with the ‘Include’ directive, as shown below.


Include conf.d/*.conf

Now, add the code given below either in the VirtualHost or in the separate configuration file (e.g. ssl.conf) created in the /etc/httpd/conf.d directory.


SSLEngine on
SSLCertificateFile /path_of_crt_file/www.domain.com.crt
SSLCertificateKeyFile /path_of_key_file/www.domain.com.key

This tells Apache which .crt (certificate) and .key (private key) files to use for the SSL-encrypted connection.
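
For reference, here is a minimal sketch of what the complete SSL VirtualHost could look like; the ServerName, DocumentRoot and file paths are placeholders to be replaced with your own values.

<VirtualHost *:443>
    ServerName www.domain.com
    DocumentRoot /var/www/html

    # Point Apache at the certificate and private key generated earlier
    SSLEngine on
    SSLCertificateFile /path_of_crt_file/www.domain.com.crt
    SSLCertificateKeyFile /path_of_key_file/www.domain.com.key
</VirtualHost>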

4) Restart Apache

The final step is to restart the Apache service for the changes to take effect.


/etc/init.d/httpd restart

You can verify the SSL setup by just loading your website with HTTPS, eg: https://domain.com

If your website loads over https, you can be sure SSL is installed correctly for your website.

Of course, if that all looks too much then you can just get us to install your SSL certificate for you. Or you can purchase a low cost SSL certificate and we will do everything for you end to end. Simply contact us at



How to Disavow Links – The Ultimate Guide

As we know, in modern times, backlinks are one of the most important factors when it comes to SEO. In this tutorial, though, we are not talking about acquiring backlinks, but removing them. Why would we remove them? Well, not all links are good backlinks; bad backlinks increase your spam score and can count against you. This is why it is good to use the disavow tool to remove unwanted backlinks.

Unlike some other topics, Google has been quite forthcoming about its views on backlink disavowal and its place in your search engine optimisation toolbox. Let’s look at why you may want to disavow a link and how to go about using the tool.

Why Should We Disavow Links?

If your spam score is increasing or poor backlinks are having a negative effect on your SEO, then the disavow tool is what you should be using. If you receive a message from Google in your Webmaster Tools about having ‘Unnatural Links’, you are being penalised whether you are knowingly complicit or not.

An important role for any SEO expert is addressing any Google penalties that may arise. It’s not an issue if you stick to white hat SEO, but knowing how to keep a clean backlink profile for your website is vital when it comes to your SEO strategy.

Another great tool you can use to monitor negative backlinks is the Moz Domain Analysis tool. This will provide you with metrics for your PA (Page Authority), DA (Domain Authority) and a Spam Score, which are all ranked on a scale of 1-100. However, it is worth mentioning here that if we disavow links with Google’s Disavow Tool, it will not take effect in Moz. We are simply using this tool to check which links we could potentially disavow. Unfortunately, Moz doesn’t have any visibility of which links have been disavowed in Google, so within Moz it won’t reduce your spam score.

As Moz put it: “We, unfortunately, will not detect that a link has been disavowed; this isn’t something we were able to include in the old or new DA models. Since disavowing links doesn’t actually remove a link, it just signals to Google that the links aren’t important, our crawler will continue to find the link. We are working on having something like that built into Link Explorer, where if you disavow a link in Google, you can also mark it as disavowed in Moz, but I’m afraid that option isn’t available yet.”

So, what constitutes a healthy backlink profile?

In general, the majority of organic backlinks can be classified as ‘good backlinks’. They represent the ideal Internet that Google is looking for: one where a website with great content is referenced frequently, naturally and freely. A single backlink won’t really have a big impact on your website, but backlinks slowly build a reputation for your website and establish it as a trustworthy and authoritative source.

On the other hand, ‘bad backlinks’ are mostly non-organic (with few exceptions). Two of the biggest offenders are purchasing backlinks in bulk from shady websites, and intentional backlink schemes which utilise a private blog network (PBN). You can read more about this in this article: What do I need to know about buying backlinks in 2020?.

It is also possible to gain ‘organic’ links from spammy websites that are just lists of products and links with no real content. The link was most likely generated by a script/robot, and it is certainly not benefitting your site, in which case, these are the ones we want to be removing.

Negative SEO Attack

Just to add here: with disavowing backlinks, you have to be really careful. It is a serious action that can significantly impact your search ranking, whether for better or worse, so be very careful with the links you wish to disavow.

Google considers it a pretty last-resort option. You can find this in Webmaster Tools > Advanced and there are 3 warning screens you have to click through before you can upload a disavowal file.

In general, you should only disavow a link that you know for sure is bringing you down. Check the Google Quality Guidelines for a more exhaustive list.

Note that a link from a low-traffic or low domain authority site is not necessarily a bad link. It probably won’t contribute much individually, but every link is a vote of confidence in your site that Google takes into account. Obviously, high PA/DA websites will increase your score quicker, but a thousand low PA/DA websites linking to your website will also help you.

So, What Happens When You Disavow a Backlink?

Essentially, this is a request for Google to ignore those links to your domain. If the link disavow is successful, it won’t be counted for or against you when determining ranking in the search results.

Do note, Google is not obligated to honour your request to disavow the links. If you have a look in their documentation, you will see that it is only a suggestion to disavow the links you are requesting. Also, it can take up to 48 hours before it takes effect on your website’s search ranking.

Is it possible to undo a link disavow?

It is indeed. You are always able to download your current disavow file and edit it to your requirements. I wouldn’t edit it too frequently, you really want to be sure that the links in which you want to disavow are the right ones and the ones you wish to stick with.

How do I Disavow Links in Google Search Console?

If your site is verified with Google (for example via a Google Analytics tracking tag on your website), then you will also have access to Google’s Search Console tool.

Within the Google Console, you can conduct a link audit from the Link Report page. Just click the Export External Links button on the top right of the screen then select the More Sample Links option. You can export this file if you wish as well.

Once you have determined which links you are sure you wish to disavow, you will need to create a text file (*.txt) which you can upload to the Google Disavow Tool. With this, you need to follow a specific format but it’s really simple. The format goes as follows:

  • Each entry has to be on a separate line
  • To disavow an entire domain, the entry needs to begin with domain:; to disavow a single page, list its full URL
  • You can name the text file anything you like

Following these rules, here is an example of how you can block either a complete domain or just a specific page:


# Pages to disavow
https://spam-website.com/this-is-a-bad-link
https://spammy-website.com/oh-damn-another-bad-link

# Domains to disavow
domain:spam-website.com
domain:super-spammy.org

Blacklisting an entire domain will save you quite some time over blocking just a specific URL. However, there are a few instances in which you might want to disavow a single link from a site but still allow other links from that domain, which is why the option is available.

Head over to the Google Disavow Tool, select your domain and click through all of the warnings prompts until you reach the dialogue box that allows you to browse your folders and choose a file to upload. Select the disavow file you already created, and select Open to upload it to the disavow tool.

Within the next day or so, Google will no longer take the listed domains into account when determining your pages’ ranking.

Conclusion

Disavowing links can be intimidating if you don’t know what to do, so be sure you know what you are doing before adding a list of domains/pages. You can potentially ruin your SEO if you abuse disavowing, so it’s crucial to get it right.

We think every site owner needs to know about disavowing. It can mean the difference between a clean link profile and a spammy one. Plus, there are so many benefits to disavowing links correctly. Why? Well, your website will most likely get a boost in ranking, which will result in more traffic to your website.

As already mentioned though, you don’t want to disavow links if you can help it; a manual removal is always preferable.

Only disavow links when you have no other choice. In this case, don’t be afraid to use the Disavow Tool, at the end of the day, that’s why it’s readily available.

Hopefully, you have a better understanding of why you should disavow links and when you should do it. It can be a bit of a daunting task, but once you get used to it, it’s super easy.

Has the disavow tool ever helped you in the past? How much did it improve your rankings? Let us know! 🙂


Pay Monthly Website Solutions

We have just soft-launched our pay monthly ‘sister’ website, which goes by the name of Code Twenty Four. It’s quite far from the completion stage, but it’s off to a great start. Let us know what you think!

So why did we start rolling out a new pay monthly website package? Well, we quite often get requests for pay monthly solutions, and they’re extremely beneficial for smaller businesses who simply can’t afford to pay large upfront costs before a project commences.

How does it work?

Well, the process goes like so:

1. Choose your Package

Choose a website package that suits your requirements. If you’re not quite sure, talk to one of the members of our team and they will put together a customised package tailored exactly to your requirements.

2. Design Concepts

The designers will start working on a fresh, bespoke design that meets your requirements. Should they not get it right the first time, don’t worry; the designers will provide up to 3 revisions for any particular page to ensure your vision comes to life.

3. Website Build

Once the design is signed off, the development begins. They will start building your site and provide a development link where you can view the latest work carried out.

4. Go Live!

Once your new website is fully tested, secured and given the go-ahead, it will go live to the whole world. Ohh yeahhh!

What’s even better, you can get a website up and running without paying anything as there are no up-front costs. The way we like to think about things is that a website should be like a phone contract. You get the phone, pay monthly and if you want all the bells and whistles then you can add ‘bolt-ons’ or ‘add-ons’ whenever you like.


What services do you get?

Everything! You get exactly the same as what we usually offer; the only difference is the payment scheme. As always, we offer the following services:

  • Brand design
  • Responsive design
  • CMS systems
  • E-commerce
  • Speed Optimisation
  • Social Media Marketing
  • Email campaigns
  • SEO (Search Engine Optimisation)
  • App development
  • Hosting
  • Website Security
  • Copywriting

Why did we do it?

Well, as we said before, we want to cater to everybody’s needs. If you are a start-up company looking to start selling online, then we have your back! Maybe you can’t afford the up-front costs and this is the perfect solution for you; that’s why we created Code Twenty Four.

Conclusion

We created Code Twenty Four because we already had a lot of interest in this method of paying for a website. We fully understand why some businesses would prefer to do this, so we created a service where you can benefit from:

  • A modern, clean website design by an experienced team of web designers.
  • No hidden charges or price increases.
  • No upfront costs.
  • Free, unlimited updates.
  • 7-days-a-week customer support.
  • A fully secure website (SSL / https).
  • A mobile-friendly, responsive website (which will look pixel-perfect on desktop, tablet and mobile).
  • Super-fast website hosting.
  • A content management system to edit your website content and images.

Let us know what you think of pay monthly solutions and the new website we have created!


What do I need to know about buying backlinks in 2020?

Have you ever had someone say to you that they’ve had a sudden decrease in rankings on their website? Everyone wants to rank on the first page for certain keywords on the search engine results page (SERP), and if you’ve been there before, you’ll realize how impactful this is on traffic and revenue coming through to your website.

There are various reasons why this could happen, but it’s most commonly down to low-quality or bad backlinks which can be considered spam. These low-quality links are usually the result of website owners purchasing backlinks for their website. When you consider the amount of time, effort and money people put into building website traffic, a sudden negative impact due to poor backlinks can be hugely disruptive to their business, both mentally and financially.

Before we get started, we’re going to mention a great tool we like to use to monitor the performance of our website: Moz. You can also download their browser extension here, which makes it easier to analyse your domain; it will give you a DA score, a PA score and a spam score as well.

PA & DA stand for Page Authority and Domain Authority, respectively. These are numeric metrics, both scored out of 100, and they are often used to estimate how search engines might view the strength of a particular URL’s or domain’s backlink profile. However, Google does not use PA or DA as ranking factors; think of Moz as a third-party metric that gives you an idea of what your PageRank might look like.

There are similar link-graph metrics from other third-party tools as well such as:

  • Ahrefs: DR & UR (Domain Rating & URL Rating)
  • Majestic: TF & CF (Trust Flow & Citation Flow)

Which ones should you use? Well, it all comes down to personal preference. Some SEOs use these metrics; others don’t trust link-graph metrics at all or think they’re useless.

Personally, we prefer the Ahrefs & Majestic metrics, as we’ve worked at a web host previously and got a taste for the scale of the different bots in practice. Moz doesn’t actually crawl that much, whereas Ahrefs & Majestic have a much larger crawl infrastructure, so we would trust them more… But again, the choice is down to you!

You can read more about Link Building on one of our previous posts here.

So, what are backlinks anyway?

Backlinks are links from one website to another website. Google and other search engines consider backlinks as a vote of authenticity: if another website links to you, it means that website is giving you a ‘vote of trust’. If you have thousands of backlinks pointing to your website, this ‘could’ increase the worth of the page in the eyes of any search engine. One thing you have to understand, though, is that a few backlinks from reputable websites are going to be worth a lot more than many links from disreputable sites. How can we determine if a site is reputable? Well, as we mentioned before, we can use certain tools to get an idea of how ‘worthy’ a backlink is.

Let’s look at Facebook for example:

Facebook, at the time of writing, has a PA of 100 (perfecto!), DA of 96, 1,601,867,432 backlinks, and only a spam score of 1%. Pretty incredible right?

Since the advent of search engines and the rise of Google, backlinks have remained instrumental in ranking websites. Despite regular changes in the search engine algorithms, experts still regard the number of quality backlinks as one of the most important ranking factors for any website.

That being said, how did backlinks become of such great importance for a website? It wasn’t always like this but here is a bit of history of backlinks.

History of Backlinks

Once upon a time, search engines such as AskJeeves, Yahoo, AOL and Excite, amongst a few others, dominated the search landscape. Developers of search engines were always trying to find a way to rank websites based on the quality of the information that appeared on them. In the beginning, search engines ranked websites based on the number of keywords on a particular website.

In those golden days, whenever somebody searched for something, the top-ranking pages would be the ones containing the most relevant keywords for that particular search. Let’s say you were to search for ‘Website Design’; a page would normally rank based on how often ‘Web Design’ and similar terms appeared on it. What was the issue? Well, this method of ranking led people to do things such as ‘keyword stuffing’, stuffing their website with hundreds of keywords on every page just to gain priority (or even a top ranking) on search engines for a particular search term. This meant a lot of low-quality websites would rank quite highly even when they weren’t actually about the given topic. Crazy, right?

Not only that, but to keep the look and feel of a website intact, people were actually hiding these keywords, placing them off the screen (say, 9999px to the left) just to get a higher ranking and increased web traffic.

This is when Google first came into the spotlight, back in 1998. Their objective was to change how information is presented to the user in search listings. Over time, Google started putting more emphasis on backlinks instead of keywords. Why? Well, keyword-based ranking obviously had flaws, and it’s much harder to persuade other website owners to link to a particular page on your website; because of this, the number of backlinks became the preferred signal over keywords and other factors.

It was a very wise thing to do. If you look at big companies such as Apple, Amazon, CNN and so on, they all had one major thing in common: thousands of backlinks. For Google, this was an awesome start, as people were then shown information from authority websites that was more relevant to their search terms.

Over time, people also found ‘hacks’ to gain a higher ranking. Business owners started purchasing backlinks in bulk from companies that sold them, anchored on the keywords they wanted to rank for (with a link back to their website). As a result, it once again became possible to climb up the search engine ladder and reach the top page quite easily.

This left Google with yet another problem to address. To tackle it, they started blacklisting websites, decreasing their rank, or in more severe cases, outright blocking the website. This meant a lot of websites suddenly noticed a massive decrease in their rankings.

For some websites, it meant losing thousands of pounds of revenue in a single day’s operation. For others, it meant restarting their journey from the ground up. Even today, Google and other prominent search engines are proactively changing their search algorithms to stop website owners from gaming the system.

To get a better idea of how you can rank for SEO these days, you should check out this blog post: The 4 best SEO practices for 2020 and beyond.

So, should I buy Backlinks?

Well, since Google and many other search engines now put a huge emphasis on backlinks, it’s quite obvious that many website owners will feel pressured into buying them.

They continue to buy them because it saves a lot of time and helps them get results faster. But there is a problem here…

Google and other search engines will eventually catch on to this, which can lead to the website being penalised; for established websites, that can even mean going bankrupt.

Google does have some clear guidelines regarding link-building schemes. These guidelines are regularly published by SEO companies and experts who try to steer people away from buying backlinks. Unfortunately, some website owners don’t take any notice of this, assuming it’s just a quick and easy (lazy) way of attracting customers to their website.

Before going down that route, have a look at what Google itself says about buying backlinks (aka link schemes):

Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links, or sending someone a “free” product in exchange for them writing about it and including a link.

Funnily enough, one of the first things Google mentions in its guidelines is link schemes and how it prohibits the buying and selling of backlinks. Exchanging money for links, or generating excessive link exchanges where sites link to one another purely for ranking purposes, is banned by Google. Should they discover that you have bought links, they WILL severely penalise your website, often without any explanation. Google is very clever; don’t underestimate their power.

If this does happen to you, you’ll be stuck in a pretty sticky predicament: you will be left with nowhere to go. The only thing you can do is contact Google using their webmaster tools and hope for a reply. Google is not known for replying to such queries, so consider yourself lucky if your website is restored at a later stage… very lucky indeed!

Don’t just take our word for it though; a quick Google search will tell you that most people lose their minds trying to recover from such a situation. The most practical way of recovering is to remove the ‘bad links’, which will ultimately restore the website a lot quicker. If you don’t have access to the websites hosting those links, you should consider using Google’s Disavow Links Tool. This allows publishers to tell Google that they don’t want certain links from external sites to be considered as part of Google’s system of counting links to rank websites. It can take some time to recover, but this tool can be very useful. We spoke about Moz earlier and how it provides a spam score as a percentage; with that tool, you can actually see which links are considered spam and include them in your disavow file to regain the desired 0% spam score.

To get an idea of how seriously search engines like Google take automated backlinks, consider that Google initiates nearly 400,000 manual actions every month. On top of the manual actions, thousands of websites are regularly hit by Google algorithm updates, which are specifically designed to stop low-quality websites from popping up in the search results.

What can we take from this? Well, for any website owner, it is clear that search engines are actively monitoring low-quality backlinks and spam issues, so you always need to be very careful! Buying backlinks is an easy strategy, but is it worth the risk? Definitely not.

Here are some truths about Backlinks…

Google and other search engines DO NOT promote link building. If you’re quite new to the wonderful world of SEO, this might sound confusing or contradictory, as backlinking is widely used to improve the SEO of a website.

Despite what Google suggests, SEO experts have shown time and again that backlinks are a huge factor in achieving a higher page rank. For this reason, if someone tells you that link building is not important, they probably don’t have a clue what they are talking about.

Nothing against Google, and if the truth were told, if I were one of Google’s executives or the owner of an emerging search engine, I would probably do the same. After all, there are reasons to continue focusing on all aspects of your website, which DO include high-quality backlinks.

What should we focus on then?

High Quality Backlinks

Backlinks are not all ‘equal’. What we mean here is that there are high-quality backlinks and low-quality ones. A high-quality backlink is typically a link from an authoritative and legitimate website, whereas a low-quality backlink is generally one from a low-ranking website, a spam website, or a link exchange.

What we mean here is that you should not simply focus on the total number of backlinks you have, but instead on their overall quality. For example, we would rather have 10 quality backlinks (where the PA and DA are very high) than 200 low-quality backlinks from disreputable or spammy websites. In fact, those low-quality backlinks can actually lower your overall ranking on SERPs.

Let’s cover: Do Follow vs. No Follow Links

What is a do follow link?

A do follow link is a hyperlink that tells search engines to pass along its page rank influence to the site it points to.

What is a no follow link?

A no follow link is the exact opposite of a do follow link: it is a hyperlink that tells search engines not to pass its page rank influence on to the site it points to.

Before we begin setting up a link building campaign, it’s very important to understand the difference between a do follow and a no follow link.

The history of backlinks suggests that a “do follow” link is the most valuable type of link because it counts towards the authority of your website; search engines only pass link juice through “do follow” links. For this reason, if you have, let’s say, 100 links with a “do follow” attribute, search engines will count all 100 of those links towards the ranking of that particular page. However, if your backlinks carry a “no follow” attribute, search engines will simply ignore them or pass very little juice to the external website. You can read more about this from Neil Patel in his blog post: Should You Waste Time and Money on Nofollow Links? Here’s a Final Answer. When it comes to SEO, he’s the guru; we 100% recommend subscribing to his mailing list, you will gather an unreal amount of SEO knowledge.

Anyway, the “no follow” attribute was originally introduced to reduce spam. How? Well, previously lots of people were deliberately leaving links on wiki pages and high-value forums, links that quite often pointed to low-quality websites. These links didn’t reflect a genuine endorsement by the site owners, because anyone could go to a lot of websites and add a backlink to their own site themselves.

The solution to this was the “no follow” attribute. You can read more about it here. These days, most websites such as Wikipedia, forums, and any commenting system will only allow “no follow” links, which do not count as votes. As a result, it helps fight spam.
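To make the difference concrete, here is a minimal sketch of what the two link types look like in HTML markup (example.com is just a placeholder domain):

<!-- A "do follow" link (the default): search engines may pass ranking influence to the target -->
<a href="https://example.com/">A do follow link</a>

<!-- A "no follow" link: rel="nofollow" asks search engines not to pass ranking influence -->
<a href="https://example.com/" rel="nofollow">A no follow link</a>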

If these “no follow” links don’t count towards the ranking of a website, then what’s the point in building such links right?

While “no follow” links may not count as a vote, they can definitely get you a lot of traffic if placed on an authoritative website. Over time, hundreds of these no follow links can build a regular stream of referral traffic that continues non-stop. It means you don’t need to put in any further effort after placing the link, because the link will keep bringing traffic to your website.

In the end, it always helps to build a mix of no follow and do follow links. Even when you’re using an SEO professional to rank your website, ask them what proportion of do follow vs. no follow links they plan to build.

To conclude…

We have now seen how search engines have evolved over time. In the olden days, you could rank a website simply by stuffing its pages with thousands of keywords and gain a load of traffic. We can’t do that anymore; search engines have evolved, and we now know that we can get severely penalised for using such techniques. The same goes for buying backlinks… if you get caught, you will get penalised, it’s as simple as that. There are cases where people have been fortunate enough to get away with it, and some even try to find loopholes to make it safer. We won’t mention those techniques, but you have to ask yourself the question: is it worth it?

For us, we’re pretty happy gaining backlinks from websites we build (i.e. by adding ‘Developed by Silva Web Designs’ in the footer), and if anyone mentions one of our articles in their blog posts, then awesome! We tend to concentrate on several other SEO techniques as well. We won’t list them all, but this includes best practices like only having one h1 tag per page, optimising all images and improving page speed, which can be checked using tools such as Google’s PageSpeed Insights or another one we like to use, GTmetrix.

There are other ways to gain traffic depending on your budget as well, such as Google Ads. This works like an auction system: Google Ads, aka Google AdWords, is Google’s advertising system in which advertisers bid on certain keywords in order for their clickable ads to appear in Google’s search results. Since advertisers have to pay for these clicks, this is how Google makes money from search. Generating paid traffic to your website can also generate revenue, so if the ROI (return on investment) is worth it, it’s definitely a great tool to use.

If you have any questions in regards to this article, feel free to get in touch or leave a comment below.


How to Configure VirtualHosts in XAMPP on a Mac

We used to be massive fans of AMPPS, but since upgrading to macOS Catalina it’s no longer possible to use that application. I’m sure they will update it again in the future, but the issue is that it’s a 32-bit application, so it’s no longer compatible with the latest OS. That’s when we made the switch to XAMPP. It’s not as straightforward to use as AMPPS, but it’s probably one of the better applications for localhost development.

Installing was a breeze, but things became a bit more complicated when setting up Apache VirtualHosts. So here are the steps we took to get everything running as we wanted it. First things first…

What are VirtualHosts?

Okay, so we’re going to explain what we wanted to change in order to get our localhost set up exactly how we wanted it…

VirtualHosts allow Apache to map a hostname to a directory on the filesystem. You can set up as many VirtualHosts as you need so that each website operates under its own hostname. For example, you might want to map yoursite.silva to /Users/myusername/yoursite. To test your development site, all you would need to do is enter “http://yoursite.silva” into your browser’s address bar. By default, the site would instead be reached at http://localhost/yoursite.

So how do we make this change?

How to Enable VirtualHosts

Firstly, you’ll need to open the following file in your preferred text editor: /Applications/XAMPP/xamppfiles/etc/httpd.conf. An easy way to get there is by going to ‘Finder’ –> ‘Go’ –> ‘Go to Folder’ and then simply pasting in the location.
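Alternatively, if you’re comfortable with the Terminal, you can open the file directly from the command line (nano is just one option here; use whichever editor you prefer, and sudo may be needed depending on your file permissions):

sudo nano /Applications/XAMPP/xamppfiles/etc/httpd.conf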

Now that you have the file open, you will need to look for these lines:


# Virtual hosts
#Include /Applications/XAMPP/etc/extra/httpd-vhosts.conf

Now, uncomment the second line by removing the hash (#) so that Apache will load your customised VirtualHosts file, as follows:


# Virtual hosts
Include /Applications/XAMPP/etc/extra/httpd-vhosts.conf

Let’s create your VirtualHosts

Let’s now open the following file: /Applications/XAMPP/xamppfiles/etc/extra/httpd-vhosts.conf. Towards the bottom of the file you will see some example VirtualHosts, which you should comment out or delete.

At the bottom of the file, add ‘localhost’ as the default named VirtualHost like so:


# localhost
<VirtualHost *:80>
    ServerName localhost
    DocumentRoot "/Applications/XAMPP/xamppfiles/htdocs"
    <Directory "/Applications/XAMPP/xamppfiles/htdocs">
        Options Indexes FollowSymLinks Includes execCGI
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>

What this does is allow http://localhost to keep pointing at XAMPP’s htdocs directory once we have created our own VirtualHosts.

With this done, we can now create our own VirtualHosts. So, after the default localhost, we can now add:


# My Custom Host for 'yoursite.silva'
<VirtualHost *:80>
    ServerName yoursite.silva
    DocumentRoot "/Applications/XAMPP/xamppfiles/htdocs/yoursite.silva"
    <Directory "/Applications/XAMPP/xamppfiles/htdocs/yoursite.silva">
        Options Indexes FollowSymLinks Includes execCGI
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>

In the example above, you should replace yoursite.silva with your own hostname; this can be pretty much anything you like. The one thing to avoid is a hostname that conflicts with a real domain name, like yoursite.dev. We used to use yoursite.dev, but since that TLD became owned by Google we can no longer use it, along with quite a few others.

Now we have another step to do to get this working fully…

Edit your hosts file

Now we need to edit the hosts file at /etc/hosts so the system knows how to handle the new server name. The hosts file is used by OS X to map hostnames to IP addresses. In our case, since we are developing on localhost, we want to map the server name to the IP address 127.0.0.1 (your localhost IP).
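For example, to map the yoursite.silva hostname we used above to your local machine, you would add a line like this to /etc/hosts (you will need sudo to save the file):

# Map the development hostname to this machine
127.0.0.1    yoursite.silva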

Pro Tip: You can actually map any website to any IP address. We tend to do this for website migrations. Let’s say we moved silvawebdesigns.com to a new IP address; we could simply add 192.123.123.124 silvawebdesigns.com (add www. if your website contains this in the web address). The advantage of this is that we can test the website on the new server before updating the nameservers or DNS; this way we can be 100% sure the site will work perfectly before the migration.

So now that we’ve mapped the server name to your localhost, the next step is…

Restart Apache

Whenever you update any of these files, you need to restart Apache for the changes to take effect. This can be done through the XAMPP Control app found here: /Applications/XAMPP/XAMPP Control.app
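If you prefer the Terminal, the standard (non-VM) XAMPP install also bundles a control script you can use instead; a minimal sketch, assuming the default install path:

# Restart all XAMPP services, including Apache
sudo /Applications/XAMPP/xamppfiles/xampp restart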

Now, let’s point your browser to http://yoursite.silva (or whichever server name you chose) and you should see your website.

If it’s all working now, then happy days! But should you have a problem like a 403 error, then see below:

Oh no, I have a 403 error!

Since XAMPP’s Apache runs as the ‘daemon’ user by default, it may not have the permissions required to browse your OS X user directory or its sub-directories. In this scenario, you’ll receive a 403 ‘access forbidden’ error when attempting to view your site on localhost. In other cases, you may find that you can view the site but you get PHP errors when attempting to write files or create directories within your filesystem.

So, in order to fix this, you can configure Apache to run as your OS X user. To do this, open the following file: /Applications/XAMPP/xamppfiles/etc/httpd.conf and look for the following lines:


# User/Group: The name (or #number) of the user/group to run httpd as.
# It is usually good practice to create a dedicated user and group for
# running httpd, as with most system services.
#
User daemon
Group daemon

Now, change User to your OS X username, and save the file:


User yourusername

Restart Apache again and you should now be able to navigate your site without any issues, including manipulating files and folders using PHP.

Should you have any further problems, try setting your user’s read and write privileges on the following file: /Applications/XAMPP/xamppfiles/htdocs/xampp/lang.tmp
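As a rough sketch of how you might do that from the Terminal (assuming the default XAMPP install path), you could take ownership of the file and grant yourself read and write access:

# Make your OS X user the owner of the file, then ensure it is readable and writable
sudo chown $(whoami) /Applications/XAMPP/xamppfiles/htdocs/xampp/lang.tmp
sudo chmod u+rw /Applications/XAMPP/xamppfiles/htdocs/xampp/lang.tmp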

To conclude

Hopefully, you now have your custom domain set up via localhost on OS X. As we’ve already mentioned, adding custom domains isn’t as simple as it was with AMPPS, but once you’ve completed the initial setup, all you have to do is update your /Applications/XAMPP/xamppfiles/etc/extra/httpd-vhosts.conf file with your new domain by copying and amending one you added previously, then add 127.0.0.1 yournewdomain.silva to your /etc/hosts file and you’re all set!

If you need any assistance or run into any issues setting this up, drop us a comment below and we’d love to help you!


Posted by: Nathan da Silva

Nathan is the Founder of Silva Web Designs. He is passionate about web development, website design and basically anything digital related. His main expertise is with WordPress, Magento and Shopify, as well as many other frameworks. Whether you need responsive design, SEO, speed optimisation or anything else in the world of digital, get in touch. If you would like to work with Nathan, simply drop him an email at [email protected]
