2 Weeks To HTTPS Becoming a Necessity for Websites

It’s July 9th, and two weeks from today the web officially makes full HTTPS a requisite – a development that’s been a long time in the making. Securing traffic on the internet is an obvious priority, though of course there are still people strongly opposed to a fully secure web.

Two weeks from today, Google will uniformly label any site loaded in Chrome without HTTPS as ‘not secure’. Most webmasters will be on top of this, and accordingly usage of HTTPS is exploding right now. In the six months leading up to a recent report, use of HTTPS grew 32% among the top 1 million sites. Mozilla, which tracks anonymous telemetry via the Firefox browser, recorded big growth in the rate of pages being loaded over HTTPS – roughly 75% of page loads – and Chrome sits at around the same 75 percent.

We’re a Canadian web hosting provider who’s always got our thumb on the pulse of the industry, so it’s important to relate that quite a few popular sites on the web still don’t support HTTPS (or fail to redirect insecure requests) and will soon be flagged by Google. Plus, let’s clear up a few emerging myths about HTTPS:

  • It’s a Hassle
  • I Don’t Need It
  • It’s Gonna be Slow
  1. It’s A Hassle

No, it’s pretty darn simple. You can protect your site with HTTPS in a matter of seconds for FREE. Sign up for Cloudflare, or use a CA such as Let’s Encrypt. We can assist you with any other web security and accessibility concerns you may have beyond HTTPS encryption of your website.
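
To show how little hassle it really is, here’s a minimal sketch of serving a site over HTTPS once a certificate is in place, using Node’s built-in modules. The Let’s Encrypt-style file paths and the domain are assumptions for illustration, not a prescription.

```typescript
// Minimal sketch: serve over HTTPS and bounce plain-HTTP visitors to the secure URL.
// Assumes a certificate has already been issued (e.g. by Let's Encrypt); paths are illustrative.
import * as fs from "node:fs";
import * as http from "node:http";
import * as https from "node:https";

const tlsOptions = {
  key: fs.readFileSync("/etc/letsencrypt/live/example.com/privkey.pem"),   // hypothetical path
  cert: fs.readFileSync("/etc/letsencrypt/live/example.com/fullchain.pem"), // hypothetical path
};

// The secure site itself.
https.createServer(tlsOptions, (req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("Hello over HTTPS\n");
}).listen(443);

// Redirect any insecure request to the HTTPS version of the same URL.
http.createServer((req, res) => {
  res.writeHead(301, { Location: `https://${req.headers.host}${req.url}` });
  res.end();
}).listen(80);
```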

  2. I Don’t Need It

Well it turns out, you do – particularly as it relates to the safety and privacy of those visiting your site. Without HTTPS, anyone in the path between your visitor’s browser and your site or API can peer in on (or make modifications to) your content without you ever being made aware of it. Governments, employers, and especially internet service providers can and have been monitoring content without user consent.

If having your users receive content unmodified and safe from maliciously injected advertisements or malware is a priority for you, you are advised to move your website to HTTPS.

Add to that the fact that the major browser makers – Apple, Google, Mozilla, and Microsoft – are restricting new functionality to work only over HTTPS. Google will soon block unencrypted mobile app connections by default in their upcoming Android version, and Apple has announced that apps must use HTTPS, though a firm enforcement deadline has yet to be set.

  3. It’s Gonna be Slow

The last common myth about HTTPS is that it’s not speedy enough. This belief is a holdover from an era when SSL/TLS might have had a negative performance impact on a site, but that’s simply not the case today. HTTPS is also now required to enable and enjoy the performance benefits of HTTP/2.
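
As a quick illustration of that HTTP/2 point: in Node, for instance, the http2 module is paired with TLS in practice, since browsers will only negotiate HTTP/2 over HTTPS. A minimal sketch, with the certificate file names assumed:

```typescript
// HTTP/2 in practice rides on TLS: browsers only speak it over HTTPS.
import * as fs from "node:fs";
import * as http2 from "node:http2";

const server = http2.createSecureServer({
  key: fs.readFileSync("privkey.pem"),    // hypothetical certificate files
  cert: fs.readFileSync("fullchain.pem"),
});

server.on("stream", (stream) => {
  stream.respond({ ":status": 200, "content-type": "text/plain" });
  stream.end("Served over HTTP/2, on top of TLS\n");
});

server.listen(8443);
```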

Here are the two real (but minor) sources of overhead to consider:

1) It takes incrementally more CPU power to encrypt and decrypt data – a cost that’s negligible on modern hardware; and

2) establishing a TLS session requires up to two additional network round trips between the browser and the server (just one with TLS 1.3).

Serve your HTTPS content from the edge – 10-20 milliseconds away from your users in the case of Cloudflare – and SSL/TLS-enabled sites come out ahead; even when they’re not served from an edge provider they still perform at a high level. Advanced users should also consider using HSTS to instruct the browser to always load your content over HTTPS, saving it a round trip (plus page load time) on subsequent requests.
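
For the HSTS point above, here’s a minimal sketch of what sending that policy can look like from a Node HTTPS handler; the one-year max-age value is illustrative, not a recommendation for every site.

```typescript
import type { ServerResponse } from "node:http";

// Attach an HSTS policy to a response; call this from whatever HTTPS handler serves your pages.
function addHstsHeader(res: ServerResponse): void {
  // Browsers that have seen this header will go straight to HTTPS on later visits,
  // skipping the insecure redirect round trip entirely.
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
}
```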

Google is Blocking Ad Blockers in Chrome: Paid Web Browsers the Future

Many people lament the fact that the Internet can’t simply be an unimpeded digital information source, free of commercial interests to the extent it has them now. It would be nice if it were a fountain of knowledge that exists exclusively for everyone’s own information gathering, but living in the world we do, when there’s a buck to be made somewhere the opportunity will be taken. It’s especially frustrating for people who aren’t big consumers and have rarely, if ever, clicked on an ad or purchased anything online.

Google has recently moved to limit Chrome’s ad-blocking capabilities, and no doubt many of you using an ad-blocker will have already noticed this. Google also announced that this restriction will not apply to Google’s paid G Suite Enterprise subscribers. Here at 4GoodHosting, we’re a Canadian web hosting provider who keeps our thumbs on the pulse of the digital world, and the prospect of ad-free internet browsing only via paid web browsers would be a pretty big deal for nearly all of us who source information online.

According to a recent study, as many as 40% of people browsing the web from laptops use an ad blocker. That’s a big group of people that aren’t viewing Google’s ads. So why’s this happening, and what’s the underlying current here?

Beyond Blocked Blockers

It’s been reported in the news how Chrome users – and developers of Chrome-friendly, ad-blocker extensions – are none too pleased with Google’s proposed changes to the Chrome Extensions platform. We have to go back to when Google announced Manifest V3, which constituted a set of proposed changes to Google Chrome’s Extensions platform.

In it, specific changes to Chrome’s webRequest API were proposed with an eye to limiting its blocking version, potentially removing blocking options from most events and leaving them observational only. Content blockers would instead use a different API, known as declarativeNetRequest. The Manifest concluded that this new API is “more performant and offers better privacy guarantees to users.”

The reality, though, is that Google’s Manifest V3 changes will not only prevent Chrome’s ad-blocker extensions from using the webRequest API as they normally would, but will also force them onto the new declarativeNetRequest API – one that isn’t compatible with how existing popular ad-blocker extensions function, effectively rendering them useless.
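
To make the difference concrete, here’s a hedged sketch of the kind of static rule the declarativeNetRequest model expects an extension to declare up front – the browser applies it without consulting the extension per request. The ad-host pattern is purely illustrative.

```typescript
// A declarativeNetRequest-style rule, declared ahead of time rather than decided per request.
const blockRule = {
  id: 1,
  priority: 1,
  action: { type: "block" },
  condition: {
    urlFilter: "||ads.example.com^",     // hypothetical ad host
    resourceTypes: ["script", "image"],  // only these request types are affected
  },
};

// Existing ad blockers instead register a webRequest callback and decide per request;
// that blocking callback is what Manifest V3 takes away.
console.log(JSON.stringify(blockRule, null, 2));
```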

It’s fairly clear to see that Google is being receptive to the concerns of paying advertisers in ensuring the delivery of their ads to site visitors, and they’re not going to be supportive of ad blockers from now on.

A recent industry publication carried a statement from a spokesperson at Google regarding these changes in Chrome – “Chrome supports the use and development of ad blockers. We’re actively working with the developer community to get feedback and iterate on the design of a privacy-preserving content filtering system that limits the amount of sensitive browser data shared with third parties.”

They then added further, “for managed environments like businesses, we offer administration features at no charge.”

For now, Google is still intending to block ad blockers in Chrome, while people who are subscribed to their G Suite Enterprise-level of services will enjoy ad-free viewing.

Pay to Play Soon?

In the past, Chrome could be an ad-free browsing experience at no additional cost. Now it seems you’ll have to subscribe to premium G Suite services, and the highest, most expensive version of it at that. How much? It’s $25 per user, per month, and that’s no small change for any type of online monthly service.

It’s not difficult to figure out Google’s interest in doing this. They can increase the amount of revenue generated from users viewing ads, based in large part on the fact that most people won’t pay for G Suite, so more of them will see ads – and click through them.

Keep in mind that competing browsers like Microsoft Edge and Firefox are still fine with supporting ad blockers, so it’s fair to assume there’ll be people who’ll abandon Chrome for another browser, even if they think Chrome is superior. There are many people who simply can’t stand ads, particularly if they’re researching for work or academic purposes and time is of the essence.

Google’s low and mid-tier G Suite subscribers will still be seeing ads too; it’s only the $25-a-month subscribers who’ll be enjoying ad-free browsing. G Suite Basic is $6 per user per month and G Suite Business is $12 per user per month.

Any of you planning to jump ship if your ad blocker is rendered useless?

Chromium Manifest V3 Updates May Disable Ad Blockers

It’s likely that a good many of you are among the thousands upon thousands of people who have an ad blocker installed for your web browser of choice. Some people use them simply to avoid the nuisance of having to watch ad after ad, and it’s people like these who have led some sites to insist that you ‘whitelist’ them in order to proceed into the website you want to visit. That’s perfectly understandable, as those paying advertisers are how the website generates income for the individual or business.

For others of us, however, who spend a great deal of the working day researching and referencing online, having to watch ads before getting to the content we need is a real drag on getting work done. For us, an ad blocker is much more a tool of necessity than of convenience. Still, we get caught up in more than a few sites that insist on being whitelisted too. For me, my ad blocker is a godsend and I don’t whitelist any website or disable my ad blocker for any of them.

Here at 4GoodHosting, part of what makes us a good Canadian web hosting provider is having built up an insight into what really matters to our customers. The bulk of them are people who use the Information Superhighway as a production resource rather than web ‘surfers’ for whom it’s more of an entertainment one. That’s why today’s news is sure to be very relevant for most of our customers.

Weakened WebRequest APIs

Some of you may not know how your ad blocker works, and that’s perfectly normal. As long as it does its job, you don’t really need to know. Chromium is the open-source browser project that Google’s Chrome is built on, so changes made there can be expected to flow into the browser that’s become nearly ubiquitous as most people’s web browser of choice.

However, Chromium developers have shared in the last few weeks that among the updates they are planning in Manifest V3 is one that will restrict the blocking version of the webRequest API. The alternative they’re introducing is called the declarativeNetRequest API.

After becoming aware of it, many ad blocker developers expressed their belief that the introduction of the declarativeNetRequest API will mean many already existing ad blockers won’t be ‘blocking’ much of anything anymore.

One industry expert stated on the subject, “If this limited declarativeNetRequest API ends up being the only way content blockers can accomplish their duty, this essentially means that two existing and popular content blockers like uBO and uMatrix will cease to be functional.”

What is the Manifest V3 Version?

It’s basically a mechanism through which specific capabilities can be restricted to a certain class of extensions. These restrictions are indicated in the form of either a minimum, or maximum, version.

Why the Update?

Currently, the webRequest API allows extensions to intercept requests and then modify, redirect, or block them. The basic flow of handling a request using this API is as follows:

  • Chromium receives the request
  • Chromium queries the extension
  • Chromium receives the result from the extension and acts on it
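
Here’s a hedged sketch of that blocking flow as it looks inside an extension’s background script today (the chrome.* globals exist only in the browser, so this is illustrative rather than runnable on its own, and the URL test is made up):

```typescript
// Blocking webRequest flow: Chromium pauses the request, asks the extension, and acts on the answer.
declare const chrome: any; // provided by the browser in an extension context

chrome.webRequest.onBeforeRequest.addListener(
  (details: { url: string }) => {
    const shouldBlock = details.url.includes("ads."); // illustrative matching logic
    return { cancel: shouldBlock };                   // the extension's answer back to Chromium
  },
  { urls: ["<all_urls>"] },
  ["blocking"] // the blocking option Manifest V3 plans to restrict
);
```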

However, in Manifest V3 the blocking form of this API will be limited quite significantly. The non-blocking form of the API, which permits extensions to observe network requests but not modify, redirect, or block them, will not be discouraged. In addition, the exact limitations they are going to put on the webRequest API have yet to be determined.

Manifest V3 is set to make the declarativeNetRequest API the primary content-blocking API in extensions. This API allows extensions to tell Chrome ahead of time what to do with a given request, instead of Chromium forwarding the request to the extension, which enables Chromium to handle a request synchronously. Google insists this API is overall a better performer and provides better privacy guarantees to users – the latter of which is of course very important these days.

Consensus Among Ad Blocker Developers and Maintainers?

When informed about this coming update, many developers were concerned that the change would end up completely disabling all ad blockers. The concern was that the proposed declarativeNetRequest API would make it impossible to develop new and functional filtering engine designs. This is because the declarativeNetRequest API is no more than the implementation of one specific filtering engine, and some ad blocker developers have commented that it’s very limited in its scope.

It’s also believed that with the declarativeNetRequest API, developers will be unable to implement other features, such as blocking media elements that are larger than a set size and disabling JavaScript execution through the injection of CSP directives, among other features.

Others are making the comparison to Safari’s content blocking API, which essentially puts limits on the number of admissible rules. Apple introduced that similar API fairly recently, and the belief is that this is part of why Google has gone in the same direction. Many seem to think that extensions written against that API are still usable, but fall well short of the full power of uBlock Origin. The hope is that this API won’t be the last of them in the foreseeable future.

Dedicated IP Addresses and SEO

Even the most layman of web endeavourers will be familiar with the acronym SEO. We imagine further that there are very few individuals anywhere, if any, who don’t know it stands for search engine optimization, or understand just how integral SEO is to success in digital marketing. Most people with a small business that relies on its website for maximum visibility with prospective customers will hire an SEO professional to optimize their site. That continues to be highly recommended, and for 9 out of 10 people it is NOT something you can do effectively on your own, no matter how much you’ve read online or how many YouTube videos you’ve watched.

Here at 4GoodHosting, we are like any other top Canadian web hosting provider in that we offer SEO optimization services for our clients. Some people will think that choosing the best keywords and having them at the ideal density is what’s most integral to good SEO, and that’s true by and large. But there are a number of smaller yet still significant factors that influence SEO, and they’ll be beyond the wherewithal of most people.

Whether websites benefit from a Dedicated IP address rather than a Shared IP address isn’t something you’ll hear discussed regularly. When you learn that the answer is yes, they do, and exactly why, however, it’s a switch many people will want to consider if they currently have a Shared IP address. Let’s have a look at why that is today.

What Exactly Is an IP address?

For some, we may need to start at the very beginning with all of this, so let’s begin by defining what exactly an IP address is. Any device connected to the Internet has a unique IP address, and that’s true whether it’s a PC, laptop, mobile device, or your web host’s server. In the common IPv4 format it’s made up of a string of four numbers, each ranging from 0 to 255. Here’s an example of one:

1.25.255.255

This numerical string code makes the machine you are using known. Once it’s identified – and it has to be – the Internet is then able to send data to it. You now can access the hundreds of thousands of websites along the Information Superhighway.
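
If you’re curious what that identification step looks like in practice, here’s a minimal sketch using Node’s built-in resolver to see which IP address the network will use to reach a host; the hostname is illustrative.

```typescript
import { lookup } from "node:dns/promises";

// Resolve a hostname to the IP address traffic will actually be sent to.
async function showAddress(host: string): Promise<void> {
  const { address, family } = await lookup(host);
  console.log(`${host} resolves to ${address} (IPv${family})`);
}

showAddress("example.com"); // illustrative hostname
```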

What’s a Shared IP address?

In most instances, the server your web host uses to host your site will be a single machine with a matching single IP address. For most people – and nearly all who go with the most basic hosting package without giving it much thought – you’ll be set up in an arrangement where the server is hosting thousands of websites like yours. It’s not ‘dedicated’ to you and your site exclusively.

Instead, all of the websites hosted on it will be represented by the single IP address allocated to the web host’s server. Now if your website is utilized for more of a personal venture or hobby and it’s NOT going to be a leverage point in trying to secure more business, shared hosting will probably be fine. Alternately, if page rankings are a priority for you then shared hosting may be putting you at a disadvantage.

The solution? A dedicated IP address for your Canadian website. If you need one, we can take care of that quickly and fairly easily for you. But we imagine you’ll need more convincing, so let’s move on now to explaining what constitutes a Dedicated IP address.

The Dedicated IP Address

A Dedicated IP address doesn’t necessarily mean having your own server with only one website on it – yours. It is common, in fact, for more than one site to reside on a specific server. A Dedicated IP address is an IP address that is allocated to a single website, instead of one being assigned to the server and representing every website hosted there by default.

The Purpose of Dedicated IP Addresses

The primary appeal of Dedicated IP addresses is that they help larger ecommerce operations be more secure, particularly as it regards sensitive data like credit card numbers, etc. On a more individual scale, though, a dedicated IP address is superior for SEO interests as well.

Why is that? Let’s list all of the reasons here:

1. Speed

When you share space, you share resources, and insofar as shared web hosting and shared IP addresses are concerned that means you are sharing bandwidth. The long and short of it is that all those other sites on the same server will be slowing yours down. That might be a problem in itself, but if it isn’t, then the way slow site speeds push you further down Google’s rankings will be.

Adding a unique IP address to your site will not automatically mean it loads faster, but migrating to a Dedicated Server with a Dedicated IP address definitely will. Sites with a Dedicated IP address are faster, more reliable, and more secure, and that’s a big deal.

2. SSL

For nearly 5 years now Google has been giving preference to websites that have added an SSL certificate with a 2048-bit key. The easiest way to see whether that’s been done or not is whether the site’s URL has changed from HTTP to HTTPS. SSL sites typically utilize unique IP addresses. Google continues to insist that SSL impacts less than 1% of searches, but it’s a factor nonetheless and is another benefit of a Dedicated IP address.

SSL makes your website more trustworthy across public networks and can make it operate marginally faster, with the benefit that visitors get a quicker response from the website than they would if it didn’t have an SSL cert. The majority of ecommerce sites with a Dedicated IP address will also have an SSL cert.
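
A quick way to see whether a site has that certificate in place (and how strong its key is) is to open a TLS connection and look at the peer certificate. A minimal sketch, with the hostname as an assumption:

```typescript
import * as tls from "node:tls";

const host = "example.com"; // hypothetical site to check

// Connect over TLS and inspect the certificate the site presents.
const socket = tls.connect({ host, port: 443, servername: host }, () => {
  const cert = socket.getPeerCertificate();
  console.log(`Issued to: ${cert.subject?.CN}`);
  console.log(`Valid until: ${cert.valid_to}`);
  console.log(`Key size: ${cert.bits} bits`); // e.g. 2048
  socket.end();
});
```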

3. Malware

Malware is software that’s designed and disseminated for the explicit purpose of throwing wrenches into the gears of a working web system. Unfortunately, the thousands of websites that may be on a shared server drastically increase the risk of being exposed to malware if you’re one of them. Further, when you share an IP address with any site that’s been infected with malware, your site can actually be penalized despite the fact that it’s not you who’s been infected.

In these cases, you’ll be best served by going with a Dedicated IP address and choosing a more reliable Canadian web hosting provider that has measures in place to prevent malware from making its way into the servers in the first place. A dedicated IP means you’re standing alone, and you’re regarded accordingly.

How Do I Get a Dedicated IP Address?

If you’re with us here at 4GoodHosting, all you need to do is ask. We’ve been setting our customers up with Dedicated IP addresses for quite some time now, and you’ll find that when you do so through us it’s not nearly as pricey as you had expected it to be.

It’s highly recommended for any ecommerce site or one that’s utilized for strategic business aims, and it’s fair to say that you really can’t go wrong moving to a dedicated server if you’ve made the commitment to do anything and everything to protect your SEO and your page rankings moving forward. The vast majority of people see it as a wise investment, and of course you always have the option of switching back to a shared hosting arrangement if over time you don’t see any real difference or benefit for you.

Cloudflare is changing the game

In a world where Google, Amazon and Facebook dominate the tech space, Cloudflare has stolen away the headlines for the betterment of the internet with its recent announcement. The company announced on its 8th birthday that it would be launching a domain registrar, and it is unlike any we have seen before.

Cloudflare, to the shock of many in the industry, has decided not to charge anything above the wholesale cost that registries and ICANN charge to register a domain. That is right; this multi-billion dollar company has chosen to not make a single penny off of your domain registration. In a world where the average Canadian spends between $10-$15 per domain, this is remarkable.

Cloudflare is not a small company, even if it’s nowhere near Google’s scale. Its core business sees it act as a content distribution platform and secure infrastructure vendor for millions of clients across the globe. It also has recently announced it is on a path to an IPO and has raised hundreds of millions of dollars in preparation for this. So why do this?

Cloudflare is a unique company in the tech and capital market as they are doing two different things than any other major brand. First, the company does not see the internet as a property that you can corner, and instead looks to promote a free, equal and open internet, much like the values from Internet 1.0. Secondly, the company is doing things for the good of the internet, and although this might ultimately fail once the company scales, it is still a refreshing view from a larger company in the tech space.

This does leave one important question for consumers: what does this mean for the cost and registration of their domain? Well, it is a little up in the air. The Cloudflare system is still being tested and should be live within the month, but it looks to be set up much like every other registrar system. If you are up for renewal, it might be time to take a look around and see if you can benefit from using this new system. As well, for those who are operating hosting or other third-party services, the overall cost of getting a website up and running should start to drop if you choose Cloudflare as your registrar.

However, this does still leave some questions. Will the other registrars like GoDaddy also drop their prices, or will they continue with the same old pricing going forward? As well, if you are looking for other countries’ extensions or less common domain names, will Cloudflare offer those? Finally, will Cloudflare provide an easy-to-use transfer option? These are all tough questions, and we will need to wait and see how Cloudflare’s announcement changes the industry over the next few short weeks.

What are your thoughts? Is this just a bump in the road for the major registrars on the web, or the start of a more competitive space for those looking to register domains?

What’s in a ‘Tweet’: Understanding the Engagement-Focused Nature of Twitter’s Algorithm

It would seem that of all the social media platforms, Twitter is the one that businesses struggle with most in understanding just how to harness it for effective promotional means. The common assumption is that any shortcomings are related to your use of the ever-ubiquitous #hashtag, but in fact hashtags are not nearly as pivotal as you might think.

Here at 4GoodHosting, we’ve done well in establishing ourselves as a premier Canadian web hosting provider and a part of that is sharing insights on how to get more out of your online marketing efforts. Social media is of course a big part of that, and as such we think more than a few of you will welcome tips on how to ‘up’ your Twitter game.

It’s easy to forget that these social media platforms have algorithms working behind them, and working quite extensively. What’s going on behind the screen controls and narrows down what you actually see on your timeline.

For example, let’s say you have specific political affiliations. The algorithms ensure that the majority of the tweets you’ll see will be linked to that party’s views. Or perhaps you’re especially into sports. If so, plenty of sports news sources will be all over your timeline. Oppositely, if you dislike something then that theme will slowly end up disappearing over the course of the week or beyond.

All of this is a reflection of how ALL social media platforms, Twitter included, are using more and more complex algorithms to satisfy their user base and deliver content users are likely to find favourable.

So here is what you’ll need to know about Twitter’s algorithm, and the best ways to use it to your advantage.

Keep Your Eyes Peeled For These

There’s no disputing the fact that Twitter has faded quite considerably in popularity and the strength of its reach. Despite this, Twitter is really narrowing its focus on engagement, and a key way to increase engagement is through increasing the relevance of the posts users see.

Directly from Twitter’s engineering blog, here are a few of the factors that decide whether a Tweet is sufficiently engaging and thus worthy of ‘appearances’:

  • The recency of your posts, along with likes, retweets, and other things such as attached media
  • Whether you have previously liked, or retweeted the author of the tweet
  • Your previous positive interaction with certain types of tweets

Twitter will then recommend accounts and tweets for you to like over the next couple of days. Depending on your responses to those recommendations, it will then adjust the content that’s shown to you to better reflect how it is gauging your preferences.

What’s easy to conclude is that users themselves play a predominant factor in what’s going to be seen on their timelines. Liking or using the “I don’t like this” button once or twice goes a long way in this regard.

By this point it begs the question: is Twitter’s algorithm perhaps a little too simple? It is definitely not as complex as other media platforms such as Facebook, but the benefit in that is that it is easier to manipulate. Among the consequences of this is the way that smaller companies may tag a random brand or company in a tweet that has little real association with their content, and Twitter’s algorithms allow this to be a very effective means of getting increased exposure.

Gain Your Advantage

Generating engagement with your tweets is a reliable way to boost exposure and put yourself on top of the algorithm game. Engaging your audience and boosting exposure keeps you ‘in’ the talk and seeing to it you’re using the correct hashtags will ensure you’re being talked about.

Smaller companies can benefit from tagging large companies in their tweets to gain exposure, and that’s especially advisable if the company relates to what you’re talking about. Sure, it only works to a certain degree, but gaining followers by any means possible is always a plus.

Putting all this talk about engagement into perspective, it’s important to understand how to spark the right sorts of conversation. Asking random questions will make it look forced, while if you don’t interact at all you may see a dip in exposure. Find a way to be genuine in your responses, and adhere faithfully to what you’ve defined as your brand’s voice.

Federal Government Taking Canada.ca Out of Country for Web Hosting

The top dogs in the world of web hosting all reside south of the 49th parallel, and their sway of influence over consumers and the way they command the lion’s share of web hosting business is well established down in America. Recent news from the Canadian government, however, suggests that their influence may be making perhaps the biggest of inroads up here in Canada too.

Here at 4GoodHosting, in addition to being a quality Canadian web hosting provider we’re also keenly interested in developments that are both related to web hosting AND tied directly to any of the different offshoots of the business as it pertains to Canada as a whole. As such, the Canadian Government’s announcement last month that it was moving web hosting for its departmental and agency websites related to the Canada.ca domain to Amazon Web Services in the U.S. definitely caught our attention.

March of 2015 saw the government grant a contract to Adobe Corp. for a fully hosted service with a content delivery network, analytics, and hosting environments. Adobe then contracted Amazon Web Services in the U.S. to handle all of the government’s website data.

That contract has been extended by one year, and the value of it has grown exponentially – to $9.2 million.

It would seem that Canada.ca is now no longer as Canadian as it sounds. With all the reputable and reliable web hosting providers in Canada that would have no problem accommodating such a busy client, it’s worth taking a look at why the Federal Government would make this move.

Related to the Cloud & ‘Unclassified’

The Government recently produced a draft plan for cloud computing that recommended that data deemed to be “unclassified” by the government — meaning it’s seen as being of no potential harm on a national or personal level — can be stored on servers outside of Canada.

There is however some debate as to whose responsibility it is to determine what information should be considered sensitive. Further, when information is deemed sensitive, it remains unclear how that data will be stored, and where it will be stored. Of course, this raises some obvious questions on the part of registered Canadians who want to know that personal data is always entirely secure.

Spokespersons have reported that no sensitive information is being stored on the American servers, adding further that as more departments join the Canada.ca website – Canada Revenue Agency being the most notable – there will need to be workarounds implemented to ensure that sensitive data is not relayed on to the American servers.

Cloud Makes $ and Sense for Canada

The appeal of cloud computing for the Canadian government is that it will help them get better value for taxpayers’ dollars, become more streamlined in its operations, as well as better meet the evolving needs of Canadians.

Managed Web Services will be used solely to deliver non-sensitive information and services to visitors. Similarly, secure systems such as a person’s ‘My CRA’ Account will continue to be hosted on separate platforms and servers within the current GC network.

The previous Conservative government spearheaded the move to Canada.ca in 2013, and it was regarded as being a key part of the federal government’s technological transformation. The idea was to help the government save money and become more efficient by creating better efficiencies between the technological needs of the 90 departments and agencies that will be a part of Canada.ca very soon. Prior to all of this, each of the entities had their own website that came with a URL that the majority of people found very hard to remember.

All departments have until December 2017 to complete their move over to the new Canada.ca website.

One Play Ahead: Trends for Web & App Hosting

A big part of what makes an elite offensive player who he is on the ice is the ability to think the game one play ahead. Gretzky was less concerned with where the puck was and more with where it was going to be next, along with knowing exactly what he’d do with it once the puck was on his stick. Here at 4GoodHosting, we’re a top Canadian web hosting provider who similarly likes to look ahead at trends in the web and app hosting world that will dictate how we should adapt to best serve our customers.

This blog post is based on data from a comprehensive report from 451 Research, and it gives significant insight into where the marketplace should be 2+ years from now. It highlights in particular the meteoric rise in demand for managed web hosting in Canada, and how growth for web and application hosting has slowed predictably in recent years.

That’s not necessarily cause for alarm, though – it just means the plays are slower to develop now. Technology is evolving. All you have to do is take the pulse of your own web or app hosting business. Workloads tend to be moving out of the web and app hosting category, and that’s true of some products as well.

Many are responding by shuffling the IT services deck for data-gathering purposes. More and more service providers are specializing, serving a narrower or niche target market. New service categories are emerging, and we realize that we need to analyze the user preferences of our customers very insightfully right now to see where we can best put the bulk of our services technology to work for you.

Here are the numbers from the report, with three statistical predictions:

  1. As a category, web and app hosting will grow from $18.2 billion in 2015 to $25.8 billion by 2019.
  2. Total hosting revenue will increase at an annualized rate of 15.5%. What’s interesting is that the “balance of power” in terms of revenue drivers has shifted. Managed hosting is growing at a far faster rate than web/app hosting.

Here’s how that 15.5% breaks down:

  • Dedicated hosting should grow about 5.7% per year
  • Shared hosting should grow about 10.4% per year
  • Managed hosting should grow about 18.7% per year

  3. In market share:
  • Web/app hosting will drop from 36.8% to 28.5%
  • Managed hosting will increase to a mammoth 71.5%

Promoted Changes

The evolution of technology has changed the way every business competes. There have been discernible shifts in the way customers function and think about IT, and it necessitates changes to the way folks like us will approach our future moves regarding web and app hosting.

A reduced number of workloads need to be managed as part of service delivery. Internet-based infrastructure is increasingly common these days, and ever greater numbers of enterprise workloads exist in hosted environments. IaaS is gaining a lot of ground with webmasters whose workloads previously existed in a dedicated hosting environment or VPS.

Further, certain environments are now considered to be part of managed hosting. Increasing modularity of managed services means more versatility, and it’s timely for a widening range of infrastructure types and applications.

Constant Change

Identifying and understanding trends is a must for hosting providers. As a business in this industry you need to keep your feet moving and have your head on a swivel, again like you’re anticipating where the play is going and where the puck is going to be.

Customers are going to be struggling to find these new IT solutions for their businesses, and we imagine every reputable Canadian web hosting provider is going to be very proactive in responding to the new industry realities.

Promising Predictions

The ever-constant growth of the web for business continues to steam ahead as a whole. 451 Research volunteers that the sector should see an additional $7.5B in revenue in each of the next few years. That’s a large pie to be sliced up, but those who want a little more of it will have to reinvent their business model and very likely the marketing strategy that goes along with it.

Continued growth for web and app hosting will primarily come from 2 sources:

  • Adding new subscribers to grow your customer base
  • Adding new services you can sell to existing customers

The Appeal of Hybrid Cloud Hosting

Most of you will need no introduction to the functionality and application of cloud computing, but those of you who aren’t loaded with insight into the ins and outs of web hosting may be less familiar with cloud hosting and what makes it significantly different from standard web hosting. Fewer still will likely know of hybrid hosting and the way it’s made significant inroads into the hosting market, with very specific appeals for certain web users with business and / or management interests.

Here at 4GoodHosting, we’ve done well establishing ourselves as a quality Canadian web hosting provider, and a part of what’s allowed us to do that is by having our thumb on the pulse of our industry and sharing those developments with our customers in language they can understand. Hybrid hosting may well be a good fit for you, and as such we’re happy to share what we know regarding it.

If we had to give a brief overview of it, we’d say that hybrid hosting is meant for site owners that want the highest level of data security along with the economic benefits of the public cloud. Privacy continues to be of primary importance, but the mix of public and private cloud environments and the specific security, storage, and / or computing capacities that come along with the pairing are very appealing.

What Exactly is the Hybrid Cloud?

This combination of private and public cloud services communicates via encrypted technology that allows for data and / or app portability, and it consists of three individual parts: the public cloud, the private cloud, and a cloud service and management platform.

Both the public and private clouds are independent elements, allowing you to store and protect your data in your private cloud while employing all of the advanced computing resources of the public cloud. To summarize, it’s a very beneficial arrangement where your data is especially secure but you’re still able to bring in all the advanced functionality and streamlining of processes that come with cloud computing.

If you have no concerns regarding the security of your data, you are; a) lucky, and b) likely to be quite fine with a standard cloud hosting arrangement.

If that’s not you, read on…

The Benefits of Hybrid Clouds

One of the big pluses for hybrid cloud hosting is being able to keep your private data private in an on-prem, easily accessible private infrastructure, which means you don’t need to push all your information through the public Internet, yet you’re still able to utilize the economical resources of the public cloud.

Further, hybrid hosting allows you to leverage the flexibility of the cloud, taking advantage of computing resources only as needed, and – most relevantly – also without offloading ALL your data to a 3rd-party datacenter. You’re still in possession of an infrastructure to support your work and development on site, but when that workload exceeds the capacity of your private cloud, you’re still in good hands via the failover safety net that the public cloud provides.
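
Here’s a purely illustrative sketch of that “spill over when private capacity runs out” decision, just to make the idea concrete; the names, capacity numbers, and two-target model are all assumptions rather than any particular product’s behaviour.

```typescript
// Cloud-bursting in miniature: keep work on the private side until it's full,
// then overflow to the public cloud. Values here are illustrative only.
type Target = "private-cloud" | "public-cloud";

interface Capacity {
  runningJobs: number;
  maxJobs: number;
}

function chooseTarget(privateSide: Capacity): Target {
  return privateSide.runningJobs < privateSide.maxJobs ? "private-cloud" : "public-cloud";
}

console.log(chooseTarget({ runningJobs: 8, maxJobs: 10 }));  // -> "private-cloud"
console.log(chooseTarget({ runningJobs: 10, maxJobs: 10 })); // -> "public-cloud"
```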

Utilizing a hybrid cloud can be especially appealing for small and medium-sized business offices, with an ability to keep company systems like CRMS, scheduling tools, and messaging portals plus fax machines, security cameras, and other security / safety fixtures like smoke or carbon monoxide detectors connected and working together as needed without the same risk of web-connection hardware failure or security compromise.

The Drawbacks of Hybrid Clouds

The opposite side of the hybrid cloud pros and cons is that it can be something of a demanding task to maintain and manage such a massive, complex, and expensive infrastructure. Assembling your hybrid cloud can also cost a pretty penny, so it should only be considered if it promises to be REALLY beneficial for you, and keep in mind as well that hybrid hosting is also less than ideal in instances where data transport on both ends is sensitive to latency, which of course makes offloading to the cloud impractical for the most part.

Good Fits for Hybrid Clouds

It tends to be a more suitable fit for businesses that have an emphasis on security, or others with extensive and unique physical data needs. Here’s a list of a few sectors, industries, and markets that have been eagerly embracing the hybrid cloud model:

  • Finance sector – the appeal for them is in the decreased on-site physical storage needs and lowered latency
  • Healthcare industry – often to overcome regulatory hurdles put in place by compliance agencies
  • Law firms – protecting against data loss and security breaches
  • Retail market – for handling compute-heavy analytics data tasks

We’re fortunate that these types of technologies continue to evolve as they have, especially considering the ever-growing predominance of web-based business and communication infrastructures in our lives and the data storage demands and security breach risks that go along with them.

Seven Steps to a Reliably Secure Server

In a follow up to last week’s blog post where we talked about how experts expect an increase in DDoS attacks this year, it makes sense for us to provide some tips this week on the best way to secure a server. Here at 4GoodHosting, in addition to being a good Canadian web hosting provider we also try to take an interest in the well-being of clients of ours who are in business online. Obviously, the prospect of any external threat taking them offline for an extended period of time will endanger the livelihood of their business, and as such we hope these discussions will prove valuable.

Every day we’re presented with new reports of hacks and data breaches causing very unwelcome disruptions for businesses and users alike. Web servers tend to be vulnerable to security threats and need to be protected from intrusions, hacking attempts, viruses and other malicious attacks, and there’s no replacing the role a secure server plays for a business that operates online and engages in network transactions.

They tend to be the target because they are many times all too penetrable for hackers, and add to that the fact they’re known to contain valuable information. As a result, taking proper measures to ensure you have a secure server is as vital as securing the website, web application, and of course the network around it.

Your first decisions to evaluate are the server, OS and web server software you’ll choose to collectively function as the server you hope will be secure, and then the kind of services that run on it. No matter which particular web server software and operating system you choose to run, you must take certain measures to increase your server security. For starters, you will need to review and configure every aspect of your server in order to secure it.

It’s best to maintain a multi-faceted approach that offers in-depth security because each security measure implemented stacks an additional layer of defence. The following is a list we’ve assembled from many different discussions with web development and security experts that individually and collectively will help strengthen your web server security and guard against cyberattacks, stopping them essentially before they even have the chance to get ‘inside’ and wreak havoc.

Let’s begin:

  1. Automated Security Updates

Unfortunately, most vulnerabilities come with a zero-day status. Before you know it a public vulnerability can be utilized to create a malicious automated exploit. Your best defence is to ALWAYS keep your eye on the ball when it comes to receiving security updates and having them put into place. Now of course your eye isn’t available 24/7, but you can and should be applying automatic security updates and security patches as soon as they are available through the system’s package manager. If automated updates aren’t available, you need to find a better system – pronto.

  2. Review Server Status and Server Security

Being able to quickly review the status of your server and check whether there are any problems originating from CPU, RAM, disk usage, running processes and other metrics will often help pinpoint server security issues much faster. In addition, ubiquitous command line tools can also review the server status. Each of your network services logs, database logs, and site access logs (Microsoft SQL Server, MySQL, Oracle) present in a web server are best stored in a segregated area and checked with regularity. Be on the lookout for strange log entries. Should your server be compromised, having a reliable alerting and server monitoring system standing guard will prevent the problem from snowballing and allow you to take strategic reactive measures.
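
As a small example of that kind of quick status check, here’s a minimal sketch using Node’s built-in os module; the alert thresholds are arbitrary illustrative values, and a real setup would feed this into whatever alerting system you already run.

```typescript
import * as os from "node:os";

// Quick snapshot of load and memory pressure on the box.
const [oneMinuteLoad] = os.loadavg();
const memUsedRatio = 1 - os.freemem() / os.totalmem();

console.log(`1-minute load average: ${oneMinuteLoad.toFixed(2)}`);
console.log(`Memory in use: ${(memUsedRatio * 100).toFixed(1)}%`);

// Illustrative thresholds: load above the core count, or memory over 90% used.
if (oneMinuteLoad > os.cpus().length || memUsedRatio > 0.9) {
  console.warn("Server under unusual load - worth a closer look");
}
```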

  3. Perimeter Security With Firewalls

Seeing to it that you have a secure server involves the installation of security applications like border routers and firewalls that are ready and proven effective for filtering known threats, automated attacks, malicious traffic, DDoS floods, bogon IPs, and any untrusted networks. A local firewall will be able to actively monitor for attacks like port scans and SSH password guessing and effectively neutralize their threat to the server. Further, a web application firewall helps to filter incoming web page requests that are made for the explicit purpose of breaking or compromising a website.

  4. Use Scanners and Security Tools

Fortunately, we’ve got many security tools (URLScan, ModSecurity) typically provided with web server software to aid administrators in securing their web server installations. Yes, configuring these tools can be a laborious and time-consuming process – particularly with custom web applications – but the benefit is that they add an extra layer of security and give you serious reassurance.

Scanners can help automate the process of running advanced security checks against the open ports and network services to ensure your server and web applications are secure. It most commonly will check for SQL injection, web server configuration problems, cross site scripting, and other security vulnerabilities. You can even get scanners that can automatically audit shopping carts, forms, dynamic web content and other web applications and then provide detailed reports regarding their detection of existing vulnerabilities. These are highly recommended.

  5. Remove Unnecessary Services

Typical default operating system installations and network configurations (Remote Registry Services, Print Server Service, RAS) will not be secure. The more services running on an operating system, the more ports are left vulnerable to abuse. It’s therefore advisable to switch off and disable all unnecessary services. As an added bonus, you’ll be boosting your server performance by doing this, thanks to the freeing up of hardware resources.

  6. Manage Web Application Content

The entirety of your web application or website files and scripts should be stored on a separate drive, away from the operating system, logs and any other system files. By doing so it creates a situation where even if hackers gain access to the web root directory, they’ll have absolutely zero success using any operating system command to take control of your web server.

  7. Permissions and Privileges

File and network services permissions are imperative points for having a secure server, as they help limit any potential damage that may stem from a compromised account. Malicious users can compromise the web server engine and use the account in order to carry out malevolent tasks, most often executing specific files that work to corrupt your data or encrypt it to their specifics. Ideally, file system permissions should be granular. Review your file system permissions on a VERY regular basis to prevent users and services from engaging in unintended actions. In addition, consider removing the “root” account to enable login using SSH and disabling any default account shells that you do not normally choose to access. Make sure to use the least privilege principle to run specific network service, and also be sure to restrict what each user or service can do.
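
To illustrate the least-privilege idea, here’s a hedged sketch of a Node service binding its port and then dropping to an unprivileged account before doing anything else; the ‘www-data’ account name is an assumption, and process.setuid/setgid are only available on POSIX systems.

```typescript
import * as http from "node:http";

const server = http.createServer((req, res) => {
  res.end("ok\n");
});

// Bind the privileged port first (requires root), then shed those privileges.
server.listen(80, () => {
  if (process.getuid && process.getuid() === 0) {
    process.setgid?.("www-data"); // hypothetical service account; group first,
    process.setuid?.("www-data"); // then user, so the drop can't be undone
  }
  console.log("Listening on port 80 as an unprivileged user");
});
```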

Securing web servers can make it so that corporate data and resources are safe from intrusion or misuse. We’ve clearly established here that it is about people and processes as much as it is about any one security ‘product.’ By incorporating the majority (or ideally all) measures mentioned in this post, you can begin to create a secure server infrastructure that’s supremely effective in supporting web applications and other web services.