Digital Uncertainty: Is There a ‘Beyond’ to the Cloud?

When first introduced to the public – and not that long ago, really – cloud storage and cloud computing were mysterious entities entirely foreign to most people, but their advantages quickly became readily apparent. Cloud storage and computing made significant changes to the way people put their computing devices to work, as well as redefining the way companies do business.

However, as is always the way in the digital world, the same question soon follows: what's next in the evolution of personal and business computing?

Here at 4GoodHosting, we have cemented ourselves as a leading Canadian web hosting provider with affordable rates. Behind all of us here is a genuine interest in developments in the E-world and all that goes along with it. Anyone who's been as curious as we've been over the past decade plus will know that nothing stays 'new' for long, and that the engines driving technological advances in computing aren't ones to rest on their laurels.

Let's have a look at the general consensus on what's next – if anything – after the Cloud.

Wholesale Changes Aplenty

The cloud has been much more than just a place to store and access data; it has indisputably been an opportunity for growth in the IT world. This has been especially true for people beginning to understand the benefits of mobile business. If we were to list them all we'd fill your screen 20 times over, so let's look only at the main developments:

  1. Company Data Instantly Accessible From Anywhere

Companies have been trying to figure out more cost-effective ways to do their work since the beginning of commerce itself. The Cloud has made it feasible to work from almost anywhere, provided there's an Internet connection. Having an employee or contractor working remotely saves any company quite a bit in overhead costs. It also allows employees to set their own pace, and often motivates people to increase their own leisure time by performing their work more efficiently. Without the cloud, only companies that could afford multi-million dollar servers and IT departments would have been capable of offering that working arrangement.

  2. Greater Numbers of the World’s Devices Connected to Each Other

Estimates from 17 years ago had 200 million individual devices connected to the internet. While at first glance that may seem like a large number, it's not when you weigh it against the number of people in the world. Now there are an estimated 10 billion devices connected to the internet at some point every day, and being able to connect them through a data storage and sharing system like the cloud makes for much more opportunity for worldwide growth and potential innovation.

  3. The Changing Landscape of Business

One thing the Cloud can indisputably boast is having superior data uploading and downloading capabilities. It has made it possible to purchase files on-the-go, and from a sales perspective the ability to tap into the impulses of the market has been wholly revolutionary. It is more important than ever to now connect with the mobile generation and make downloading content quick and easy. The cloud has been what’s allowed that to happen.

The Reach of Web Technology

1899 – well over 100 years ago – saw a patent official state that ‘everything that can be invented, has been invented.’ Aside from being entirely wrong, it’s an amusing anecdote, especially for young people who’ve been right on the front line for the digital technology explosion of the early 21st century. We’ve of course seen gigantic leaps in technology around the planet since. And believe it, we’re not even near done with these advancements.

Future Cloud Computing Models?

No doubt the cloud has been massively successful and continues to be integrated into numerous business models and services, with IT engineers and designers trying to figure out ways to improve its systems. Some improvements have been necessitated by modern usage trends, while others hope to cross into new realms. Some of these aims are as follows:

Increasing Security – The security of the public cloud is one of the biggest concerns associated with it. As more people begin sharing information, what measures are in place to prevent other people from accessing private information? It’s a question that the majority of businesses and individuals take seriously, and cloud designers strive to improve current security options for the growth of future cloud use and expansion. Hybrid cloud configurations are one of the most interesting ways to make security a non-issue, combining a physical server with a private cloud model and restricting them to specific use only.

Expanding Applications – Cloud-compatible CAD programs are essential to countless industries and companies that produce designs or products. The issue, however, is that many CAD files are too large to be compatible with most mobile devices. Changes to the way CAD interacts with the cloud aim to put an end to this incompatibility, making it possible to work on-site or at a client’s location.

With smart phones and tablets, applications are everything and quite literally drive the manufacturing and marketing industries around them. Changing the way apps interact with the cloud is something that many vendors have been looking long and hard at, and the next generation of cloud-based apps is likely to arrive soon.

Cloud-based businesses – First and foremost here is something known as outservicing. Many of you have heard of outsourcing, but the cloud is in the process of making outservicing a household name too. There are a few aspects of company work that are not unique to individual businesses, and the best example of this is Human Resources. By outservicing HR data to cloud-based companies, small companies are much more likely to receive first-rate HR services for more affordable prices.

Cloud sensor spots – We live in an ever-more digital and impersonal world, but some companies are trying to bring back personal communication. As much as that’s possible. A cloud sensor spot is a field where the cloud interacts with a mobile device in cool and exciting ways. For example, it could give people passing by a certain city spot an opportunity to take advantage of a special deal, or to learn more about a product or service. They’d function in a similar way to Wi-Fi, but with custom content that’s been chosen and uploaded by the owner of the cloud-spot.

Keep an eye out for this one!

The Big & Small

We’ll all surely agree that the Cloud has been well received, and for good reason. But it is likely not the be-all and end-all of web connectivity solutions, either now or in the future. However, until whatever’s next materializes and is then embraced on a global scale the way Cloud Computing has been, here we are.

Look for the cloud to continue to be a major focus for vendors, large and small businesses, and the general public. The more that people who are on the web for specific purposes (and not just web browsing for entertainment or personal research) experiment and share their ideas through data storage systems like the cloud, the faster technology will be pushed to progress.

Waiting for these new developments will be a challenge, particularly for those like us who love a new digital wrinkle more than most. It’s all too easy to be looking to the horizon, but when you really weigh the value of what we currently have with the cloud, it’s quite easy to be appreciative.


What’s in a ‘Tweet’: Understanding the Engagement-Focused Nature of Twitter’s Algorithm

It would seem that of all the social media platforms, Twitter is the one that businesses struggle with most in understanding just how to harness for effective promotion. The common assumption is that any shortcomings are related to your use of the ever-ubiquitous #hashtag, but in fact hashtags are not nearly as pivotal as you might think.

Here at 4GoodHosting, we’ve done well in establishing ourselves as a premier Canadian web hosting provider and a part of that is sharing insights on how to get more out of your online marketing efforts. Social media is of course a big part of that, and as such we think more than a few of you will welcome tips on how to ‘up’ your Twitter game.

It’s easy to forget that these social media platforms have algorithms working behind them, and working quite extensively. What’s going on behind the screen controls and narrows down what you actually see on your timeline.

For example, let’s say you have specific political affiliations. The algorithms ensure that the majority of the tweets you’ll see will be linked to that party’s views. Or perhaps you’re especially into sports. If so, plenty of sports news sources will be all over your timeline. Oppositely, if you dislike something then that theme will slowly end up disappearing over the course of the week or beyond.

All of this reflects the fact that ALL social media platforms, Twitter included, are using more and more complex algorithms to satisfy their user base and deliver content they are likely to find favourable.

So here is what you’ll need to know about Twitter’s algorithms, and the best ways to use them to your advantage.

Keep Your Eyes Peeled For These

There’s no disputing the fact that Twitter has faded quite considerably in popularity and the strength of its reach. Despite this, Twitter is narrowing its focus to engagement, and a key way to increase engagement is by increasing the relevance of the posts seen.

Directly from Twitter’s engineering blog, here are a few of the factors that decide whether a Tweet is sufficiently engaging and thus worthy of ‘appearances’:

  • How recent the post is, along with its likes, retweets, and other attributes such as attached media
  • Whether you have previously liked or retweeted the author of the tweet
  • Your previous positive interactions with certain types of tweets
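
To make the interplay of these factors concrete, here's a minimal sketch of how an engagement-style relevance score might combine recency, engagement counts, and viewer affinity. Every weight, decay rate, and field name here is invented for illustration; Twitter's actual model is far more complex and not public.

```python
import math
import time

def relevance_score(tweet, viewer, now=None):
    """Combine recency, engagement counts, and viewer affinity into one score.

    All weights below are arbitrary illustrative choices, not Twitter's.
    """
    now = now or time.time()
    age_hours = (now - tweet["posted_at"]) / 3600.0
    recency = math.exp(-age_hours / 24.0)       # decays over roughly a day
    engagement = math.log1p(tweet["likes"] + 2 * tweet["retweets"])
    # Boost authors this viewer has previously liked or retweeted
    affinity = 1.5 if tweet["author"] in viewer["liked_authors"] else 1.0
    media_boost = 1.2 if tweet["has_media"] else 1.0
    return recency * engagement * affinity * media_boost
```

Even a toy model like this shows why a fresh, media-rich tweet from an author you've interacted with will crowd out an older, plainer one on your timeline.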

Twitter will then recommend people to like over the next couple of days. Depending on your responses to those recommendations, it will then adjust the content that’s seen by you to better reflect how it is gauging your preferences.

What’s easy to conclude is that users themselves play a predominant factor in what’s going to be seen on their timelines. Liking or using the “I don’t like this” button once or twice goes a long way in this regard.

At this point it begs the question: is Twitter’s algorithm perhaps a little too simple? It is definitely not as complex as those of other platforms such as Facebook, but the benefit is that it is easier to manipulate. Among the advantages of this is the way that smaller companies may tag a random brand or company in a tweet that is otherwise completely unrelated to them. Twitter’s algorithms allow this to be a very effective means of getting increased exposure.

Gain Your Advantage

Generating engagement with your tweets is a reliable way to boost exposure and put yourself on top of the algorithm game. Engaging your audience and boosting exposure keeps you ‘in’ the talk and seeing to it you’re using the correct hashtags will ensure you’re being talked about.

Smaller companies can benefit from tagging large companies in their tweets to gain exposure, and that’s especially advisable if the company relates to what you’re talking about. Sure, it only works to a certain degree, but gaining followers by any means possible is always a plus.

Putting all this talk about engagement into perspective, it’s important to understand how to spark the right sorts of conversation. Asking random questions will make it look forced, while if you don’t interact at all you may see a dip in exposure. Find a way to be genuine in your responses, and adhere faithfully to what you’ve defined as your brand’s voice.

New WPA2 Wi-Fi Protocol Security Flaw Warning

This past week has seen an explosion of cautions extended to people using home Wi-Fi networks (which of course is pretty much ALL of us) regarding a security risk that makes private information and personal content increasingly vulnerable to theft or misuse. It’s certainly not the first time such an issue has come to the attention of the digital world, and it won’t be the last. This one, however, is particularly noteworthy given the fact that it has such far-reaching and widespread potentially negative implications for anyone who’s on the web via a Wi-Fi connection – at home or elsewhere.

Here at 4GoodHosting, we strive to be on top of trends and developments in the industry to go along with being a premier Canadian web hosting provider. This ‘heads up’ should be especially welcome for business owners operating an e-commerce website, but we imagine it’s going to also be well received by your average web browsing guy or gal as well.

Malevolence from your Modem?

Credit for catching this new flaw goes to a team of Belgian researchers. They’re the ones who recently discovered a security vulnerability in the WPA2 protocol. The WPA2 protocol is a system of rules that dictate how your Wi-Fi networks function and behave. As mentioned, it’s a near ubiquitous and wide-reaching ‘standard’ – it’s installed and in use with almost every single modern Wi-Fi modem or router. We’re going to go ahead and assume that includes you, and as such this warning is one you’ll want to take note of and follow the precautionary measures we’ll lay out here.

The research has indicated that there’s a loophole in the WPA2 rules that creates the possibility for hackers to tap into a Wi-Fi network and grab sensitive information that’s being relayed back and forth over it, with one example (and likely the most disconcerting of all the possibilities) being the theft of your credit card details when you enter them in the process of buying something online. Another possibility could be snagging your password when you enter it into the login for a particular website.

Here is a good read on the issue in detail, via the official website.

Who’s Most at Risk?

Plain and simple, the answer here is as suggested above – meaning pretty much everyone is at risk. That’s because WPA2 is the most common protocol utilized with Wi-Fi modems and routers these days, and has been for quite a while.

We’ll also go ahead and assume the majority of you are relatively computer savvy, but for those of you who aren’t you can easily determine if that’s the case by looking under computer>system preferences>network connections and then have a look at your Wi-Fi network settings. It’s nearly certain you’re on a WPA2 network, so read on.

Your Best Course of Action

Modem and router manufacturers are very much aware of this issue, and are working hard to make patches for their products available to their customers.

We recommend a visit to the website of the manufacturer of your Wi-Fi modem or router. Determine if they have released a recent update for your model (look for a date within the last month or two). Check out this list of popular modem & router manufacturers to determine whether or not a suitable patch is already offered to protect your make and model against this vulnerability.

Some of you may not be able to install an update for your modem or router on your own. If so, not to worry – most manufacturers’ websites offer support guides, or the option to call them for technical support.

Should I Change My Wi-Fi Network’s Password?

It shouldn’t be necessary. Your Wi-Fi network password isn’t a factor with this particular security flaw. You’re fine to leave it as it is, and that’s sure to be welcome news for most of you who’d rather not have to create a new one.

This threat gains access to networks via non-primary means; to use an analogy, it’s like a burglar who comes in through a window or down a chimney rather than the front door of your house. Your password is guarding your front door, but that’s not where you need to be concerned.

We’ll continue to monitor developments with this new WPA2 security threat and keep you informed as necessary. Be a little proactive on your own part with the recommendations above and you should be good to continue enjoying wireless Internet.

The Rise of ‘Cryptominers’ and Why You Need to Be Wary of Them

Over the past few months we’ve devoted a post or two to the rise of cryptocurrencies like Bitcoin and how they’re still worth taking note of despite the fact they haven’t ‘taken off’ quite like people expected them to. Different people have different takes on whether they will ever become a legitimate player on the global currency scene, but we believe there is in fact going to be demand for currencies that are not regulated by any specific international bodies and can be uniform from one country and currency to the next.

Here at 4GoodHosting, we’re a leading Canadian web hosting provider who also takes a keen interest in developments in the digital world. That’s likely a hallmark of any good provider – staying on top of trends and the like and choosing the most relevant ones to share with their customers.

Right then, let’s continue.

Not-So-Harmless Browsing

It would seem that Internet ads are now the least of your concerns when it comes to annoyances. Recent news indicates that the websites you visit could now be prompting your computer to do what’s called ‘cryptocurrency mining.’ So with an existing understanding of what a cryptocurrency is, we now need to ask what exactly cryptocurrency mining is.

The entirety of the creation, management, conversion, and transaction of digital currencies demands a lot of computing power. Each block of transactions involves computer owners around the globe racing to solve a very challenging cryptographic puzzle, and winning means you get paid in the relevant cryptocurrency. Contestants, known as “miners”, up their chances by building up their processing capacity. Most commonly this is done by building server farms in remote locations where electricity is cheap, but they are always searching for inexpensive ways to mine for cryptocurrencies more effectively.
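
As a rough illustration of the puzzle miners race to solve, here's a toy proof-of-work loop in Python. Real networks use far harder difficulty targets and different block formats; the function name, string encoding, and difficulty scheme here are ours, purely for illustration.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Try nonces until the SHA-256 digest has `difficulty` leading zero hex digits.

    Toy sketch only: real mining uses binary targets and vastly more work.
    """
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example transactions", difficulty=4)
print(nonce, digest)
```

Each extra zero digit multiplies the expected number of hash attempts by 16, which is why miners chase ever-cheaper processing power: the winner is simply whoever can grind through attempts fastest.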

Conversely, website publishers are always on the hunt for new ways to generate revenue. The standard means – subscriptions, ads, etc. – are often insufficient. They don’t have much appeal for most users, can be hijacked, and the big search engines like Google typically take their cut of revenues.

So increasingly these days they are resorting to an unscrupulous approach: selling miners access to the computing power of the people who visit their sites.

Browsers Gone Bad

Here’s how this whole seedy transaction works. Say, for example, you go to a site to download some free stock images. As your web browser loads the first page, it also initiates a script that prompts your computer’s processor to undertake calculations for a cryptocurrency miner. That miner could be located anywhere. The only thing that might make you aware of it is a slightly slower computer, and a slightly higher power bill. The miner pays the website publisher for the use of your resources, and you’re kept in the dark about it.

Now we have to say that reputable web publishers will not hijack your computer for profit. It’s sites that haven’t been successful with traditional ad networks (many are in China) that have embraced browser-based mining as a revenue stream. Regrettably, at this time there’s little to stop them from doing it, and few effective means of blocking them.

The concept of capturing value from underutilized computer resources isn’t a new one. It actually goes back to the early days of the web. In the late ‘90s a team at the University of California, Berkeley launched SETI@home – a screensaver that contributed to the hunt for signs of alien life in radio signals – and later generalized it into the Berkeley Open Infrastructure for Network Computing (BOINC), a software system that takes the spare capacity on personal computers and re-dedicates it to scientific purposes. BOINC has since aided climate prediction, drug discovery, protein folding, and many other applications. More than 300,000 users actively participate today, and not surprisingly that makes it the largest computing grid in the world.

Doing What They Will

You might think it is, but this type of distributed computing isn’t always cost-effective; cloud computing would be a much better choice for the type of application suggested above and many others. With browser-based mining, however, visitors are compensating publishers with their computer resources and energy consumption, and the latter brings local utility providers into each transaction. Yes, they’d get a better deal by just paying a few cents per page view, but that’s something else altogether.

The Internet has made it quite clear that micropayments don’t work very well, in large part because the decision-making costs associated with each transaction outweigh the actual value transfer. As a result, the most viable kind of internet payment is one that doesn’t look like a payment at all. Keep in mind that hundreds of thousands of volunteers happily donated their computing power to SETI@home because it felt costless, even though it consumed $8 of energy each month. Ad-based models have become the default method because users don’t consciously attach a dollar value to their attention and data.
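
That $8-per-month figure is easy to sanity-check. The sketch below assumes an extra draw of roughly 100 W running around the clock and a rate of about $0.11/kWh; both numbers are our assumptions for illustration, not from the original study.

```python
def monthly_energy_cost(extra_watts: float = 100.0,
                        price_per_kwh: float = 0.11,
                        hours: float = 24 * 30) -> float:
    """Dollar cost of running an extra electrical load continuously for a month."""
    kwh = extra_watts / 1000.0 * hours   # 100 W for 720 h = 72 kWh
    return kwh * price_per_kwh

print(round(monthly_energy_cost(), 2))   # roughly $8 per month
```

The point stands either way: a cost spread invisibly across a power bill feels free in a way a direct payment never would.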

So while it does occur and is problematic, there’s no debating that in-browser cryptocurrency mining is an inefficient way of paying for content. It’s not clear that users will accept the appropriation of their resources, but if you look at it from the other perspective, it’s potentially less invasive than targeted advertising – which many people find creepy – and it takes advantage of underutilized processor resources.

Make Smart Browser / Interactivity Choices

No one’s sounding the alarm over cryptocurrency mining just yet, but it’s a growing trend and you certainly don’t want to make yourself a prime candidate for these leeches. Be smart about the sites you visit, but more importantly be selective about the way you interact with them. We won’t go as far as to say to be wary of a site’s apparent country of origin, but if you’re particularly concerned you may want to take this into account too.

Federal Government Taking Out of Country for Web Hosting

The top dogs in the world of web hosting all reside south of the 49th parallel, and their sway of influence over consumers and the way they command the lion’s share of web hosting business is well established down in America. Recent news from the Canadian government, however, suggests that their influence may be making perhaps the biggest of inroads up here in Canada too.

Here at 4GoodHosting, in addition to being a quality Canadian web hosting provider, we’re also keenly interested in developments that are both related to web hosting AND tied directly to any of the different offshoots of the business as it pertains to Canada as a whole. As such, the Canadian Government’s announcement last month that it was moving web hosting for its departmental and agency website related to the domain to Amazon Web Services in the U.S. definitely caught our attention.

March of 2015 saw the government grant a contract to Adobe Corp. for a fully hosted service with a content delivery network, analytics, and hosting environments. Adobe then contracted Amazon Web Services in the U.S. to handle all of the government’s website data.

That contract has been extended by one year, and the value of it has grown exponentially – to $9.2 million.

It would seem that is now no longer as Canadian as it sounds. With all the reputable and reliable web hosting providers in Canada that would have no problem accommodating such a busy client, it’s worth taking a look at why the Federal Government would make this move.

Related to the Cloud & ‘Unclassified’

The Government recently produced a draft plan for cloud computing that recommended that data deemed to be “unclassified” by the government — meaning it’s seen as being of no potential harm on a national or personal level — can be stored on servers outside of Canada.

There is however some debate as to whose responsibility it is to determine what information should be considered sensitive. Further, when information is deemed sensitive, it remains unclear how that data will be stored, and where it will be stored. Of course, this raises some obvious questions on the part of registered Canadians who want to know that personal data is always entirely secure.

Spokespersons have reported that no sensitive information is being stored on the American servers, adding further that as more departments join the website – Canada Revenue Agency being the most notable – there will need to be workarounds implemented to ensure that sensitive data is not relayed on to the American servers.

Cloud Makes $ and Sense for Canada

The appeal of cloud computing for the Canadian government is that it will help them get better value for taxpayers’ dollars, become more streamlined in its operations, as well as better meet the evolving needs of Canadians.

Managed Web Services will be used solely to deliver non-sensitive information and services to visitors. Similarly, secure systems such as a person’s ‘My CRA’ Account will continue to be hosted on separate platforms and servers within the current GC network.

The previous Conservative government spearheaded the move to in 2013, and it was regarded as being a key part of the federal government’s technological transformation. The idea was to help the government save money and become more efficient by creating better efficiencies between the technological needs of the 90 departments and agencies that will be a part of very soon. Prior to all of this, each of the entities had their own website that came with a URL that the majority of people found very hard to remember.

All departments have until December 2017 to migrate over to the new website.

Marea Reaches Shore: High-Capacity Telecom Cable Now Stretches Across Atlantic

The world of digital and fibre-optic technologies continues to grow in leaps and bounds, and this week we saw one of the most profound examples of just how much of a priority the business world is placing on web-based technologies. Here at 4GoodHosting, we’re a leading Canadian web hosting provider who always has a little more wind in our sails due to the fact that we’re so passionate about anything and everything that pertains to our industry.

As such, the news that a high-capacity fibre optic cable that left Virginia Beach, USA much earlier in the year has now emerged on the coast of Spain is a profound development that definitely excites us and is very much worth sharing with our customers.

‘Marea’ (Spanish for ‘tide’), as the cable has been named, has been funded by Facebook, Microsoft, and Telxius – a subsidiary of the Spanish telecommunications giant Telefónica – and is the highest-capacity cable to have ever crossed the Atlantic. In terms of significance, it represents a weighty shift in the balance of power in the submarine-cable industry. Up until now, transcontinental cables have been funded by telco consortia, and the arrangement tended to be that they would offer capacity on those systems to customers like Facebook and Microsoft for a price.

Recent years have seen skyrocketing demand for global bandwidth, and that trend has made it so that the largest of these customers have had no choice but to join in the funding of construction projects which always cost hundreds of millions of dollars at a minimum.

This one is worth that level of investment and then some. Marea is more than 4,000 miles long and boasts transmission speeds of up to 160 terabits per second. To put that in perspective, it’s roughly 16 million times faster than the average home internet connection and equipped to stream 71 million high-definition videos simultaneously.
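
Those comparisons are easy to verify with a little arithmetic. The sketch below works backward from the 160 Tbps figure to the "average home connection" speed and per-stream HD bitrate the two claims imply.

```python
# Working backward from the Marea capacity figures quoted above.
marea_bps = 160e12                    # 160 terabits per second
home_bps = marea_bps / 16e6           # implied "average home" speed
per_stream_bps = marea_bps / 71e6     # implied HD stream bitrate

# ~10 Mbps per home and ~2.25 Mbps per HD stream
print(home_bps / 1e6, per_stream_bps / 1e6)
```

Both implied figures are plausible for the era: roughly 10 Mbps for an average home connection, and just over 2 Mbps for a compressed HD video stream.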

It’s well understood that international network bandwidth and traffic have been growing in leaps and bounds, although the growth rate has slowed notably in recent years.

The fact, however, that this bandwidth and traffic grew at an annual rate well in excess of 30% between 2013 and 2017 does show the need for these types of advances and cross-continental information-exchange infrastructure. Approximately 196 Tbps of new international internet capacity was added over those 4 years, upping global capacity to 295 Tbps, but that figure doesn’t include domestic network routes.

Marea’s capacity is downright impressive, coming in at about 1/15th of that global total. As mentioned, the cable sets out from Virginia Beach, Virginia, on the US side, and lands in Bilbao, Spain. Virginia Beach is 230 miles to the south of Ashburn, Virginia, and that’s very much by design as Ashburn is the largest data center market in North America, and one that’s well on its way to becoming the largest in the world.

The appeal for Facebook and Microsoft is clear too; 4 Microsoft Azure cloud data centers are located in Virginia, and Facebook leases data center space in Virginia too. The Facebook-owned data center that’s nearest to Virginia Beach is about 400 miles to the northwest in Forest City, North Carolina.

Lastly, it’s interesting to note that another submarine cable, this one belonging to Tata Communications, connects the same Spanish town, Bilbao, to the UK and then to Portugal. Cables linking Europe to Africa and the Middle East are then accessible from Portugal.

Exciting times for those of us who are beyond keen to have the fastest and most thorough data and network connections for both business and personal pursuits. What’s nearly certain though is that – as hard as it may be to believe – it’s quite conceivable that these new submarine cables may one day become insufficient themselves. Such is the nature and projection of the digital world!



7 WordPress Plug-Ins Guaranteed to Boost SEO Big Time

WordPress continues to be the most predominant web publishing platform around, and the many years it’s held that title are a testament to just how intuitive, versatile, and capable it is for taking your content and making it presentable on the web. The old adage ‘if it ain’t broke don’t fix it’ certainly applies, as WordPress remains elementally the same as it was when first rolled out in 2003.

Here at 4GoodHosting, we’ve always had a front row view of just how well embraced WordPress is in the digital world, and in addition to being a quality Canadian web hosting provider we also try to have our thumb on the pulse of as many aspects of the industry as we can. Page rankings are going to be important for anyone who’s on the web for commercial or promotional purposes. In fact, 61% of marketers say improving SEO and growing their organic presence is their number one priority.

So this week we’re going to share a handful of WordPress plug-ins that are a breeze to install and will serve to improve your site SEO.

  1. Yoast SEO Plugin for WordPress

Feel free to regard Yoast as the Maserati of SEO plugins. It’s usually the first one that will be recommended by an experienced marketer. It is incredibly easy to use and can help you optimize multiple aspects of your WordPress site, addressing and optimizing your URL, meta description, chosen tags, keyword density, internal and external links, and content readability.

It works by first having you select a focus keyword. Next, it will analyze your SEO and provide recommendations on where improvements could be made. Green indicates you’re good as is, orange means your page needs some work, and red means you need to start from scratch as there are multiple deficiencies. Yoast will then serve up specific actions you can take to move ‘up’ the colour spectrum.

Even if you’re decidedly technically inept, you’ll likely have your SEO amped right up with this plugin.
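To make the traffic-light idea concrete, here’s a small Python sketch of how a Yoast-style check could map page attributes to a colour. The individual checks and thresholds below are our own illustrative assumptions, not Yoast’s actual scoring rules:

```python
# Hypothetical sketch of a Yoast-style traffic-light SEO check.
# The checks and thresholds are illustrative only, not Yoast's real rules.

def seo_light(page):
    """Return 'green', 'orange', or 'red' based on simple on-page checks."""
    issues = 0
    if page["focus_keyword"].lower() not in page["title"].lower():
        issues += 1                      # focus keyword missing from title
    if len(page["meta_description"]) > 160:
        issues += 1                      # meta description too long
    if page["word_count"] < 300:
        issues += 1                      # thin content
    if issues == 0:
        return "green"                   # good as is
    if issues == 1:
        return "orange"                  # needs some work
    return "red"                         # multiple deficiencies

page = {
    "focus_keyword": "canadian web hosting",
    "title": "Affordable Canadian Web Hosting",
    "meta_description": "Reliable hosting plans for Canadian businesses.",
    "word_count": 850,
}
print(seo_light(page))  # a page passing every check scores green
```

The real plugin runs many more checks than this, but the principle is the same: each failed check nudges the page down the colour spectrum, along with a specific suggested fix.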

  2. All in One SEO Pack

Yoast definitely takes top spot, but this is quite likely the second best overall SEO plugin (plus the 3+ million installs to date suggest it’s effective). All in One SEO pack was first developed in 2007 and has evolved over the past 10 years to meet the majority of demands today’s SEO marketers tend to have.

All in One SEO Pack includes robust features such as:

  • Automatic meta tag generation
  • Title optimization
  • XML sitemap support for a site that’s more readable to search engines
  • Prevention of duplicate content being created

Essentially, it addresses all of the major elements of effective SEO and – like Yoast – it works with WordPress like a charm.

  3. SEOPressor

SEOPressor also gets high marks from us. This plugin works under the same premise as Yoast and the All in One SEO Pack, delivering comprehensive on-page SEO analysis, as well as providing tips for improvements.

SEOPressor is great as an ‘insta-advisor’, helping you make ideal small tweaks and adjustments that will boost your overall SEO quality. Also, like the preceding two, you don’t need to be anything of a ‘computer whiz’ to get it installed and working for you.


  4. SEO SQUIRRLY

When top SEO experts like Neil Patel of Kissmetrics and Brian Dean of Backlinko endorse a plug-in, you can A) know it’s good stuff, and B) trust it’s been designed for non-SEO experts.

How SEO SQUIRRLY differs from other plug-ins is that it puts an emphasis on helping you create content that’s designed equally for both search engines and human readers. The importance of this is in the fact that Google places a strong emphasis on positive user experience when orienting their ever-changing algorithms.

SEO SQUIRRLY helps you find great keywords, analyzes your articles, offers advice on how to resolve issues, and helps you optimize your content for human consumption, plus it generates an XML sitemap for Google and Bing.

  5. SEO Optimized Images

Image optimization is typically a lesser consideration for your WordPress-based site, yet it’s a critical and often overlooked aspect of SEO. It’s important to ensure that search engines are able to understand the content within your images.

SEO Optimized Images is a WordPress plug-in that makes it easy to insert SEO-friendly alt attributes dynamically, along with adding valuable title attributes to your images. Long story short, it streamlines the often-laborious process of optimizing your website’s images.
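To illustrate the idea, here’s a rough Python sketch of deriving alt text from an image’s filename, the kind of dynamic substitution such a plugin performs. The formatting rules below are our own illustrative assumptions, not the plugin’s actual logic:

```python
import re

# Sketch: derive a readable alt attribute from an image filename and
# insert it into an <img> tag that lacks one. Illustrative logic only.

def alt_from_filename(filename):
    name = filename.rsplit(".", 1)[0]          # strip the extension
    name = re.sub(r"[-_]+", " ", name)         # hyphens/underscores -> spaces
    return name.strip().title()                # title-case for readability

def add_alt(img_tag, filename):
    """Insert an alt attribute into an <img> tag that lacks one."""
    if "alt=" in img_tag:
        return img_tag                          # already optimized, leave it
    alt = alt_from_filename(filename)
    return img_tag.replace("<img ", '<img alt="%s" ' % alt, 1)

tag = add_alt('<img src="red-maple-leaf.jpg">', "red-maple-leaf.jpg")
print(tag)  # <img alt="Red Maple Leaf" src="red-maple-leaf.jpg">
```

The point is that the descriptive text search engines need is often already sitting in the filename; the plugin just surfaces it in a form crawlers can read.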

  6. SEO Post Content Links

Any reputable and experienced SEO marketer will tell you that internal linking is of paramount importance for creating a strong link profile. It creates better indexing in search engines, and it points visitors to other helpful content they may be interested in, which can increase the average amount of time spent on your site.

This is a plugin that takes the guesswork out of internal link building and streamlines the process very impressively. Further, SEO Post Content Links also helps you create proper anchor text that matches current best practices.

  7. SEO Internal Links

Here’s another plugin that’s proven effective for optimizing your site’s internal link structure. Directly from its WordPress description, SEO Internal Links ‘can automatically link keywords and phrases in your posts and comments with corresponding posts, pages, categories, and tags on your blog.’ Enough said? Very likely. SEO Internal Links is ideal for anyone who isn’t especially savvy with linking, or who doesn’t yet have a sound understanding of the value of linking and indexing for a website.

In a nutshell, SEO Internal Links is a convenient way to create internal links while avoiding black hat SEO practices that could backfire on you big time should you choose to employ them. Quite plainly, don’t. The damage you can do to your site’s credibility in the eyes of the search engine bots isn’t worth whatever benefits you might get, not at all.
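The core mechanic these auto-linking plugins perform can be sketched simply: scan the post for known keywords and wrap the first occurrence of each in a link to the corresponding page. A hypothetical Python sketch (the keyword map and URLs are made up for illustration):

```python
# Sketch of automatic internal linking: replace the first occurrence of
# each mapped keyword with an anchor tag. The keyword map is hypothetical;
# real plugins also handle case, plurals, and existing links.

def auto_link(text, keyword_map):
    for keyword, url in keyword_map.items():
        anchor = '<a href="%s">%s</a>' % (url, keyword)
        text = text.replace(keyword, anchor, 1)  # link first mention only
    return text

keyword_map = {"web hosting": "/services/web-hosting"}
post = "Choosing web hosting carefully pays off, and web hosting is cheap."
print(auto_link(post, keyword_map))
```

Linking only the first mention is a deliberate choice in this sketch: repeating the same anchor on every occurrence is exactly the kind of over-optimization that can look like a black hat tactic to search engines.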

Here’s to you trying one or more and seeing your SEO get a much-needed push up the hill!

Domain Extensions and SEO Impact

Before any website makes its way up onto the information superhighway, the domain name attached to it must be registered with a hosting provider. Here at 4GoodHosting, we’re a top Canadian web hosting provider among many and we can certainly take care of that basic and straightforward formality for you. What we’re going to discuss today, however, is the way that your domain name’s extension (.com being the most common) can have direct and measurable effects on your SEO, and on your search engine ranking more specifically.

Let’s review the basics briefly; a domain name is a unique internet address that is made up of a name and extension (such as .com, .ca etc.). This extension is also referred to as a Top Level Domain (TLD) and it is the most relevant part of your domain name. We’ll move now to putting you in the know with factors that influence choosing the right domain extension and how it dictates your SEO rankings in a significant way.

Various Types of TLDs

In the infancy days of the Web, domain extensions were initially introduced to facilitate browsing across different domains. There were 6 generic top-level domains (gTLDs) marketed to folks looking to get themselves up and running, and we saw different domain extensions for different types of organizations. Some may be surprised to learn that the .com extension was actually introduced for commercial websites, and has nothing to do with the term ‘computer’.

Much more common nowadays are domain extensions with a country code, also known as country code Top Level Domains (ccTLDs). These took off between 1985 and 1990, and examples include .ca for Canada, .kr for South Korea (who have the fastest internet speeds in the world), .in for India, .uk for the United Kingdom, etc.

1998 saw the creation of the Internet Corporation for Assigned Names and Numbers (ICANN), an international nonprofit organization designed to keep the Internet secure and stable. New gTLDs were released in 2001, including .info and .pro, designed for informational websites or those representing certified professionals.

The number of domain extensions has quickly expanded since. There are now even domain extensions that utilize Arabic characters instead of the usual Latin characters. A complete list of all extensions (with Latin characters) can be referenced at the Internet Assigned Numbers Authority (IANA) website.

Specific SEO Benefits for Each Domain Extension

  • Country code Top Level Domain


A ccTLD provides Google with the strongest and clearest indication of where a website originates. Provided all other SEO factors are equal, a website with a ccTLD matching the searcher’s country (a .ca site on, for example) will be better ranked by Google than one with a generic extension.


The primary disadvantage of a ccTLD is that you will be required to purchase a new extension for each language, which will add to the cost quite considerably. Further, Google’s crawlers (aka ‘bots’) do not recognize multiple websites as one website because they have different extensions. Each website must develop its own authority.

By authority we mean the value that Google assigns to a website. More authority results in Google’s bots staying on the website for a longer period of time and indexing deeper pages of the site. This of course is very beneficial for SEO. Higher authority leads to a greater likelihood that your site will rank high on Google’s SERPS (search engine results pages). There are other factors that determine how well a website performs in this regard, and in fact Google uses more than 200 signals to determine which results are most relevant.

  • Generic Top Level Domain

Generic domain names are increasingly popular these days, with examples like .pizza, .amsterdam and .club distinguishing the nature of the business or venture very explicitly. People continue to speculate about the advantages and disadvantages of these new extensions as they relate to search engine rankings. Google has shared that the new TLDs are not more likely to score high with Google than older TLDs or ccTLDs. However, there are several examples that suggest otherwise, at least to some extent.

One of them is ‘’, which climbed to the first page in Google US search results within the span of a week. That’s worth taking note of, as it normally takes a lot of time to get to the first page on Google US, even for sites that have built up plenty of authority. The domain was purchased in November 2014 and received several links from authoritative websites that announced the transaction. 80% of those backlinks had ‘’ as the clickable text, and one week after the launch the website was already on the first SERP for the term ‘coffee club’. The takeaway is that when a gTLD (in part) matches a keyword you want to rank for, Google appears to count links that have the domain name as the clickable text toward that keyword.

Simply, ‘’ is interpreted by Google as “coffee club”. In such instances a TLD with a relevant keyword will indeed have an SEO advantage over a traditional TLD like those ending with a .com.

Google still insists that there is no advantage or disadvantage to having a new gTLD, stating that each gTLD has the same opportunity to rank well. With a gTLD, it is possible to specify which country the website is intended to serve within the Google Search Console. This of course is done via international targeting, but keep in mind that when you expand your website with a different language you must adjust or disable international targeting.

Choosing the most appropriate domain extension

Your best choice for a TLD will depend on a number of factors. Want to score well on Then you’ll be best served by choosing the overall top level domain, a .com. Conversely, if you only sell products in Canada, you’ll be wise to choose the .ca extension. Google will then recognize that your website is intended for the Canadian market and that your aim is to score better on

SEO is often not taken into account when people are weighing which TLD extension is best for them. For example, there are websites that buy a ccTLD simply so the website has a nice name that is easy to remember. A .ca domain may seem like the ideal choice for the nature of your business, but it’s probably not going to score well on This is because the .ca extension indicates to Google that your website is taking aim at the Canadian market explicitly.

When your website is in fact targeted to a specific country, though, it is advisable to choose the ccTLD of that country. In this case, you may need to purchase a new domain with another TLD for any international expansion. The country-specific nature of the ccTLD will definitely have a positive impact on your search engine results.

When you go with a gTLD, Google draws no distinction between a .com, a .pizza, or a .whateveritmaybe. gTLDs have just as much chance to score well, and as a result the choice does not affect the SEO status of your website. And yet, even while Google insists on the validity of that, there are cases like the one mentioned above that show that links with only the domain name in the clickable text are counted in Google search results. This is the case when a gTLD creates a partial match with a keyword you want to rank for.

The important thing to keep in mind when using a gTLD is that you communicate this choice to the consumer. Consumers will often undertake searches including the domain extension in the search terms. If you choose a gTLD, make sure that you make that fact very clear to your target audience; that’s most commonly done by presenting your company name WITH the extension attached in headers or any other component of the communication piece that will be visually prominent and readily identified.

Also – last but not least – go into your Google Search Console and make sure to set the international targeting to the right country.

One Play Ahead: Trends for Web & App Hosting

A big part of what makes an elite offensive player who he is on the ice is the ability to think the game one play ahead. Gretzky was less concerned with where the puck was and more with where it was going to be next, along with knowing exactly what he’d do with it once the puck was on his stick. Here at 4GoodHosting, we’re a top Canadian web hosting provider who similarly likes to look ahead at trends in the web and app hosting world that will dictate how we should adapt to best serve our customers.

This blog post is based on data from a comprehensive report from 451 Research, and it gives significant insight into where the marketplace should be within 2+ years. It highlights in particular the meteoric rise in demand for managed web hosting in Canada, and how growth for web and application hosting has slowed predictably in recent years.

That’s not necessarily cause for alarm, though – it just means the plays are slower to develop now. Technology is evolving. All you have to do is take the pulse of your own web or app hosting business. Workloads tend to be moving out of the web and app hosting category, and that’s true of some products as well.

Many are responding by shuffling the IT services deck for data-gathering purposes. More and more service providers are specializing, serving a narrower or niche target market. New service categories are emerging, and we realize that we need to analyze the user preferences of our customers very insightfully right now to see where we can best put the bulk of our services technology to work for you.

Here are the numbers from the report, with three statistical predictions:

  1. As a category, web and app hosting will grow from $18.2 billion in 2015 to $25.8 billion by 2019.
  2. Total hosting revenue will increase at an annualized rate of 15.5%. What’s interesting is that the “balance of power” in terms of revenue drivers has shifted. Managed hosting is growing at a far faster rate than web/app hosting.

Here’s how that 15.5% breaks down:

  • Dedicated hosting should grow about 5.7% per year
  • Shared hosting should grow about 10.4% per year
  • Managed hosting should grow about 18.7% per year
  3. In market share:
  • Web/app hosting will drop from 36.8% to 28.5%
  • Managed hosting will increase to a mammoth 71.5%
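As a quick sanity check on figures like these, you can compute the compound annual growth rate (CAGR) implied by the category’s endpoint revenues, $18.2 billion in 2015 to $25.8 billion in 2019:

```python
# Compound annual growth rate (CAGR) implied by two endpoint revenues.
# Figures: web/app hosting category, $18.2B (2015) to $25.8B (2019).

def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1

rate = cagr(18.2, 25.8, 4)
print("implied web/app hosting CAGR: %.1f%%" % (rate * 100))
```

That works out to roughly 9% per year for the web/app category on its own, noticeably below the 15.5% quoted for total hosting revenue, which is consistent with the report’s point that managed hosting is the faster-growing driver.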

Promoted Changes

The evolution of technology has changed the way every business competes. There have been discernible shifts in the way customers function and think about IT, and it necessitates changes to the way folks like us will approach our future moves regarding web and app hosting.

A reduced number of workloads need to be managed as part of service delivery. Internet-based infrastructure is increasingly common these days, and ever greater numbers of enterprise workloads exist in hosted environments. IaaS is gaining a lot of ground with webmasters whose workloads previously existed in a dedicated hosting environment or on a VPS.

Further, certain environments are now considered to be part of managed hosting. Increasing modularity of managed services means more versatility, and it’s timely for a widening range of infrastructure types and applications.

Constant Change

Identifying and understanding trends is a must for hosting providers. As a business in this industry you need to keep your feet moving and have your head on a swivel, again like you’re anticipating where the play is going and where the puck is going to be.

Customers are going to be struggling to find these new IT solutions for their businesses, and we imagine every reputable Canadian web hosting provider is going to be very proactive in responding to the new industry realities.

Promising Predictions

The ever-constant growth of the web for business continues to steam ahead as a whole. 451 Research volunteers that the sector should see an additional $7.5B in revenue in each of the next few years. That’s a large pie to be sliced up, and those who want a little more of it will have to reinvent their business model and very likely the marketing strategy that goes along with it.

Continued growth for web and app hosting will primarily come from 2 sources:

  • Adding new subscribers to grow your customer base
  • Adding new services you can sell to existing customers

The Appeal of Hybrid Cloud Hosting

Most of you will need no introduction to the functionality and application of cloud computing, but those who aren’t loaded with insight into the ins and outs of web hosting may be less familiar with cloud hosting and what makes it significantly different from standard web hosting. Fewer still will likely know of hybrid hosting and the way it’s made significant inroads into the hosting market, with very specific appeals for certain web users with business and/or management interests.

Here at 4GoodHosting, we’ve done well establishing ourselves as a quality Canadian web hosting provider, and a part of what’s allowed us to do that is by having our thumb on the pulse of our industry and sharing those developments with our customers in language they can understand. Hybrid hosting may well be a good fit for you, and as such we’re happy to share what we know regarding it.

If we had to give a brief overview of it, we’d say that hybrid hosting is meant for site owners that want the highest level of data security along with the economic benefits of the public cloud. Privacy continues to be of primary importance, but the mix of public and private cloud environments and the specific security, storage, and/or computing capacities that come along with the pairing are very appealing.

What Exactly is the Hybrid Cloud?

This combination of private and public cloud services communicates via encrypted technology that allows for data and/or app portability. It consists of three individual parts: the public cloud, the private cloud, and a cloud service and management platform.

Both the public and private clouds are independent elements, allowing you to store and protect your data in your private cloud while employing all of the advanced computing resources of the public cloud. To summarize, it’s a very beneficial arrangement where your data is especially secure but you’re still able to bring in all the advanced functionality and streamlining of processes that come with cloud computing.

If you have no concerns regarding the security of your data, you are: a) lucky, and b) likely to be quite fine with a standard cloud hosting arrangement.

If that’s not you, read on…

The Benefits of Hybrid Clouds

One of the big pluses for hybrid cloud hosting is being able to keep your private data private in an on-prem, easily accessible private infrastructure, which means you don’t need to push all your information through the public Internet, yet you’re still able to utilize the economical resources of the public cloud.

Further, hybrid hosting allows you to leverage the flexibility of the cloud, taking advantage of computing resources only as needed, and – most relevantly – also without offloading ALL your data to a 3rd-party datacenter. You’re still in possession of an infrastructure to support your work and development on site, but when that workload exceeds the capacity of your private cloud, you’re still in good hands via the failover safety net that the public cloud provides.
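That ‘failover safety net’ behaviour, sometimes called cloud bursting, can be sketched as a simple placement rule: sensitive workloads stay on the private side, and overflow spills to the public cloud once private capacity is used up. The capacities and workload attributes below are hypothetical, and a real orchestrator would be far more sophisticated than this greedy pass:

```python
# Sketch of hybrid-cloud placement: sensitive workloads stay private,
# and overflow "bursts" to the public cloud when private capacity runs out.
# Capacity units and workload attributes are illustrative assumptions.

def place_workloads(workloads, private_capacity):
    placement = {}
    used = 0
    for w in workloads:                       # simple greedy pass, in order
        fits = used + w["size"] <= private_capacity
        if w["sensitive"] and not fits:
            raise RuntimeError("private cloud full: cannot place sensitive workload")
        if w["sensitive"] or fits:
            placement[w["name"]] = "private"
            used += w["size"]
        else:
            placement[w["name"]] = "public"   # burst to the public cloud
    return placement

workloads = [
    {"name": "crm-db", "size": 6, "sensitive": True},
    {"name": "analytics", "size": 5, "sensitive": False},
    {"name": "batch-render", "size": 4, "sensitive": False},
]
print(place_workloads(workloads, private_capacity=10))
```

With a private capacity of 10 units, the sensitive CRM database stays private, the analytics job bursts to the public cloud because it wouldn’t fit, and the smaller render job lands back on the private side, which is the division of labour the hybrid model is built around.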

Utilizing a hybrid cloud can be especially appealing for small and medium-sized business offices, with an ability to keep company systems like CRMS, scheduling tools, and messaging portals plus fax machines, security cameras, and other security / safety fixtures like smoke or carbon monoxide detectors connected and working together as needed without the same risk of web-connection hardware failure or security compromise.

The Drawbacks of Hybrid Clouds

The opposite side of the hybrid cloud pros and cons is that it can be something of a demanding task to maintain and manage such a massive, complex, and expensive infrastructure. Assembling your hybrid cloud can also cost a pretty penny, so it should only be considered if it promises to be REALLY beneficial for you, and keep in mind as well that hybrid hosting is also less than ideal in instances where data transport on both ends is sensitive to latency, which of course makes offloading to the cloud impractical for the most part.

Good Fits for Hybrid Clouds

It tends to be a more suitable fit for businesses that have an emphasis on security, or others with extensive and unique physical data needs. Here’s a list of a few sectors, industries, and markets that have been eagerly embracing the hybrid cloud model:

  • Finance sector – the appeal for them is in the decreased on-site physical storage needs and lowered latency
  • Healthcare industry – often to overcome regulatory hurdles put in place by compliance agencies
  • Law firms – protecting against data loss and security breaches
  • Retail market – for handling compute-heavy analytics data tasks

We’re fortunate that these types of technologies continue to evolve as they have, especially considering the ever-growing predominance of web-based business and communication infrastructures in our lives and the data storage demands and security breach risks that go along with them.