Multi-Cloud Strategies Set to Dominate the Future

Reading Time: 3 minutes

Cloud computing is now nearly ubiquitous in the digital operations of businesses of all sizes, and the way it has replaced the need for physical storage, along with a whole host of other benefits, has obviously been a great development. It’s not often that a technological advance works this well while still in its infancy, but in fairness the cloud was a natural outgrowth of many other pivotal gains in computing technology coming together and bringing in other key components.

Evolution is always a part of these developments too, and when a new way of doing things is so eagerly adopted by so many people, those evolutions tend to come fast and furious. Here at 4GoodHosting we’re like every good Canadian web hosting provider in that we have a front-row seat for watching these kinds of major advances take shape, and some of us also have the web and computing savvy to have a better handle on what it all means and how it promises to bring positive change to the digital work world.

Being agile is an absolute necessity in the IT world, especially considering the roadmap is always changing. Not to say that full 180s are often required, but sometimes there is a need to plant your foot and pivot hard. Revisiting cloud infrastructure is always going to be part of this, and more often than not it’s prompted by workloads growing massively in size overnight. That term is used loosely; it’s rarely literally overnight, but the message is that big change requirements can arrive surprisingly quickly, and you need to be able to pivot and rechart without inflexibility.

At the forefront of all of this is a trend where multi-cloud strategies are seen as ideal fits for businesses, and that’s what we’ll look at here today.

Clearer Appeal

What is being seen now is multi-cloud strategies emerging as a dominant part of many organizations’ long-term IT roadmaps. A recent study conducted by an IT services firm in the US and UK came back with some interesting findings about what the near future holds for multi-cloud. The report paints a clear picture of the industry and makes clear that companies’ investments in cloud services look quite different than they did even a short time ago.

The first thing that stood out was that nearly half of respondents indicated they had chosen a new provider in the last year, which shows that shares of the cloud market are very much up for grabs among the major providers. What needs to be a primary focus for organizations right now is investing in the correct cloud strategy for their unique IT workloads.

Standard considerations like pricing and vendor incentives, security, and technical requirements are the top drivers in decision making related to cloud infrastructure, and it’s around these metrics that the bulk of decisions about what will serve the company best are made. Around 56% of respondents also indicated that security considerations will weigh heavily on final decisions about choosing a provider.

42% Considering Multi-Cloud Strategy

So as organizations do move toward multi-cloud strategies, it’s important not to overlook how private cloud environments retain their importance for organizations trying to make the best workload placement decisions across clouds.

Here are key findings from the survey:

  • Microsoft Azure leads the way as the most popular public cloud provider (58%), followed by Google Cloud (41%), IBM (40%) and AWS (38%)
  • Nearly half (46%) of respondents have gone with a new provider or platform within the last year — and more than 25% of them have made that move sometime in the past 6 months
  • Just 1% of respondents indicated having the same cloud provider or platform since their initial foray into cloud computing
  • 42% of respondents are pursuing a multi-cloud strategy
  • The vast majority of private cloud users (80% of them) stated better information security was their primary reason for going with a private cloud environment.
  • 89% of healthcare organizations and 81% of public sector interests foresee a continuing need for their private cloud over the next 5 years and beyond

Following all the major disruptions seen over the last year and more, the need for a modern technology stack is pushing cloud adoption hard in every industry. Capitalizing on the opportunities available in the market right now means cloud providers must meet these organizations’ complex security and legacy workload needs.

Password Hygiene? It’s a Thing

Reading Time: 4 minutes

You might not know it, but the word hygiene has roots in Greek mythology. Hygieia was a daughter of Asclepius, who you probably also didn’t know was the Greek god of medicine. Hygieia was the goddess of health, cleanliness and sanitation, so the origin of the word makes sense. We all know how important it is to brush our teeth every day, but apparently it’s also possible to be healthy, clean, and entirely sanitized with those digital gatekeepers we call passwords.

We’ve all seen password strength meters that give you an idea of how suitably strong your password is, but maybe far too many people are still going with 54321 or something of the sort. Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we’ve come across all sorts of stories of password failures over the years, and we try to make a point of giving customers some insights into good practices for the digital world if they need them.

And apparently the need is there. Passwords are still the primary form of authentication, but handled poorly they can leave you vulnerable to attack if your cybersecurity is not up to scratch. Passwords get stolen, and it’s happening a lot more often nowadays. They’re obtained by all sorts of underhanded means, and some of yours may no longer be exclusively in your possession either.

Billions Out There

At present there are billions of passwords available on the Dark Web, collected via attack methods ranging from malware to phishing. Many are then used in password spraying and credential stuffing attacks.

The primary reason this is possible, according to web security experts, is that around 65% of users reuse some of their passwords. That’s highly inadvisable, and if you do it you put yourself at risk from stolen or compromised credentials. Another estimate is that 1 in 5 companies that suffered a malicious data breach had it happen because of stolen or compromised credentials.

So what is poor password hygiene? It’s really any choice or omission in setting or sharing passwords that leaves doors wide open for attackers. If you’re handling your own IT, a lack of knowledge about good password practices may be putting you at risk.

Put Effort into It

Choosing weak, easily guessable passwords like common keyboard patterns, or passwords obviously connected to an organization name, location or other common identifier, is where a lot of people go wrong. Another common move is changing passwords only by adding sequential characters at the end, for example changing password1 to password2.

A good example of this is what happened to the Marriott hotel chain. Just last year attackers obtained the login credentials of two Marriott employees, compromised a reservation system, and ultimately exposed payment information, names, mailing addresses, and much more for hundreds of millions of customers.

Why It Continues

Poor password hygiene continues to be a problem because it isn’t visible enough as a problem or a potential threat. Thinking that attackers are only interested in targeting large organizations is incorrect too. Attackers do target SMBs, and they do it more often now given the growing number of online applications and remote technologies that can often be compromised fairly easily.

Another common misconception is that two-factor authentication has you covered on its own. Two-factor authentication is a good security measure, but it’s certainly not fail-safe. You still need your password to be as secure as possible.

And with Active Directory (AD), there is a belief that the built-in AD password policy is going to be sufficient. But it does not block the use of compromised passwords, and it has nothing to flag weak password construction patterns. You also shouldn’t assume that implementing and enforcing a robust password security policy will create meaningful user friction.

Simplifying Password Security

Here are some fairly solid recommendations:

  • Requiring a minimum password length of 8 characters, to encourage the use of longer passwords
  • Removing password expiration and complexity requirements
  • Screening new passwords against a list of passwords known to be leaked or compromised (a minimal screening sketch follows this list)
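
To make that screening recommendation concrete, here is a minimal sketch, assuming you maintain a local file of SHA-1 hashes of known-compromised passwords. The file name and format are placeholders for whatever breach corpus you actually use, not a specific product or feed:

```python
import hashlib

def load_breached_hashes(path="breached-sha1.txt"):
    """Load SHA-1 hashes of known-compromised passwords, one uppercase hex
    digest per line. The file name and format are placeholders for whatever
    breach corpus you actually maintain."""
    with open(path) as f:
        return {line.strip().upper() for line in f if line.strip()}

def password_is_acceptable(password, breached_hashes, min_length=8):
    """Reject passwords that are shorter than the minimum length or that
    appear in the breached set."""
    if len(password) < min_length:
        return False
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest not in breached_hashes

if __name__ == "__main__":
    breached = load_breached_hashes()
    for candidate in ("54321", "password1", "correct horse battery staple"):
        verdict = "ok" if password_is_acceptable(candidate, breached) else "rejected"
        print(candidate, "->", verdict)
```

Keeping the corpus as hashes rather than plaintext also means the screening list itself isn’t another leak waiting to happen.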

You also need to take risk level into account. Removing expiration guidelines can leave a security gap, given how long it typically takes organizations to identify a breach, so it’s a good idea to go with technical solutions that can reduce the poor password hygiene issues this can create.

Other good practices are:

  • Eliminating the use of common password construction patterns
  • Supporting user-oriented features such as passphrases (more memorable, longer passwords) and length-based password aging, which also allows less frequent expiration because the passwords are so long and strong (see the passphrase sketch after this list)
  • Continuously blocking the use of leaked passwords
  • Letting users reset their passwords with MFA (multi-factor authentication) from anywhere
  • Working with existing settings you already use, such as Group Policy
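
For the passphrase point above, here is a minimal sketch of a length-based passphrase generator, assuming any large wordlist file you have on hand (the wordlist path is a placeholder):

```python
import secrets

def generate_passphrase(wordlist_path="wordlist.txt", word_count=4, separator="-"):
    """Build a memorable, length-based passphrase by joining randomly chosen
    words. The wordlist path is a placeholder; any large list of common words
    (one per line) will do."""
    with open(wordlist_path) as f:
        pool = [word.strip() for word in f if word.strip()]
    return separator.join(secrets.choice(pool) for _ in range(word_count))

if __name__ == "__main__":
    # e.g. "maple-lantern-quiet-harbour"
    print(generate_passphrase())
```

Four random words drawn from a list of a few thousand typically carries more entropy than a human-chosen 8-character password, and it’s far easier to remember.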

This is something you want to be proactive about, and it’s really not asking too much of people to come up with a more solid and secure password. Consider what can happen if you have a weak password and you’ll know why that’s a headache you really want to avoid.

New Invention for More Climate-Friendly Data Servers

Reading Time: 3 minutes

There’s no debating the fact that a higher population means higher power consumption. In the same way, greater demands on data servers, driven by so many more people using them indirectly, are unavoidable too, and the amount of power data centers already use is definitely not good for the environment. Denmark isn’t anywhere close to being one of the most populated countries in Europe, but even there a single large data center can consume the equivalent of four percent of the country’s total electricity consumption.

That’s according to the Danish Council on Climate Change, and when you consider what that means you can imagine what the number would be like for much more heavily populated countries around the world. The growth of IoT and SaaS applications is going to increase this consumption and in the big picture it’s really not good. Here at 4GoodHosting, the nature of what we do as a quality Canadian web hosting provider means we can 100% relate to anything connected to operating large-scale data centers.

Plus, we prefer good news to news that’s not as good, and that’s why this particular piece of news out of Denmark made the cut for our blog entry this week. Let’s get into it; it might make us all feel a little more at ease about our own small but significant contributions to power usage.

A+ Algorithm

Behind all of this is a new algorithm developed by Danish researchers that can deliver major reductions in the resource consumption of the world’s computer servers. Keep in mind that computer servers are every bit as taxing on the climate as all global airline traffic combined, which is why green transitions in IT are so important.

So why exactly is this such an issue?

The world’s increasing internet usage has a very real and profound impact on the climate due to the immense amount of electricity demanded by computer servers. Current CO2 emissions from data centers are very high, and unfortunately they are expected to double within just a few years. Studies have indicated global data centers consume more than 400 terawatt-hours of electricity each year, accounting for approximately 2% of the world’s total greenhouse gas emissions.

The person who gets the credit for developing this algorithm is Professor Mikkel Thorup, together with his team of researchers. They previously found a way to streamline computer server workflows, and it resulted in considerable energy and resource savings. It was promptly adopted by Google and Vimeo, among other tech giants.

Vimeo in particular stated that using this algorithm had cut its bandwidth usage by a factor of eight.

What this team has now done is build on that algorithm, and the reason it is noteworthy is that they have built on it in a big way. It is now capable of addressing a fundamental problem in computer systems: the way some servers become overloaded while others still have capacity remaining.

Overload Stopper

The consensus is that this version of the algorithm is many times better and reduces resource usage as much as possible. What’s also hugely beneficial is that it is being made freely available to whoever would like to use it. With worldwide internet traffic continuing to soar, the new algorithm developed by Thorup and his team addresses the problem of servers receiving more client requests than they are able to handle.

Look no further than how many people are streaming content through Netflix or something similar every night. When this happens, systems commonly require clients to be shifted around so that servers have a balanced distribution. That’s challenging to do, as often up to a billion servers are part of a single system.

What usually results is congestion and server breakdowns, and with all the congestion and overload of requests comes a LOT of power and other resource consumption. Internet traffic is projected to triple between 2017 and 2022, and as it continues to increase the problem will continue to grow, but the algorithm provides a scalable solution.

Minimal Movement

To this point these types of fixes have always involved a lot of different steps, but this new algorithm isn’t like that, and that’s another reason why it’s being heralded the way it is. It ensures that clients are distributed as evenly as possible among servers by moving them around as little as possible, and by making content retrieval as local as possible.

Ensuring client distribution among servers is balanced, so that no server is burdened more than 10% above the others, would previously have meant handling an update by moving a client 100 times or more. Now it can be done in about 10 moves or fewer, even when the number of clients and servers in the system is a billion or more.
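
The researchers’ exact algorithm isn’t spelled out here, but the behaviour described, an even spread with a hard cap on how overloaded any server can get and very few reassignments, matches the general idea of consistent hashing with bounded loads. Below is a simplified, illustrative sketch of that idea in Python; it is not the team’s actual implementation, just a toy version showing how a 10% load bound can work:

```python
import hashlib
import math
from bisect import bisect_right

def _position(key):
    # Stable 64-bit position on the hash ring for a server or client name.
    return int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")

def assign_clients(servers, clients, epsilon=0.1):
    """Toy consistent hashing with a load bound: every server accepts at most
    ceil((1 + epsilon) * clients / servers) clients, so no server ends up more
    than roughly epsilon (10% here) above the average load. A client landing
    on a full server spills over to the next server clockwise on the ring."""
    ring = sorted((_position(s), s) for s in servers)
    positions = [p for p, _ in ring]
    capacity = math.ceil((1 + epsilon) * len(clients) / len(servers))
    load = {s: 0 for s in servers}
    assignment = {}
    for client in clients:
        i = bisect_right(positions, _position(client)) % len(ring)
        while load[ring[i][1]] >= capacity:   # walk clockwise past full servers
            i = (i + 1) % len(ring)
        assignment[client] = ring[i][1]
        load[ring[i][1]] += 1
    return assignment

if __name__ == "__main__":
    servers = [f"server-{n}" for n in range(5)]
    clients = [f"client-{n}" for n in range(100)]
    counts = {}
    for s in assign_clients(servers, clients).values():
        counts[s] = counts.get(s, 0) + 1
    print(counts)  # no server holds more than ceil(1.1 * 100 / 5) = 22 clients
```

In a real system each server would own many virtual positions on the ring and reassignments on server arrival or departure would be handled incrementally, but even this toy version guarantees no server exceeds the capacity cap.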


Stats Indicating Website Load Time Importance

Reading Time: 8 minutes

In last week’s entry here we touched on ‘bounce’ rates, and how having higher ones can be hugely problematic for any website that’s serving an e-commerce function. If you’ve got a basic WordPress site for your blog on a basic shared web hosting package then you won’t be particularly concerned if those who enter choose to exit shortly thereafter. If your site is the avenue by which you sell products and services and make a livelihood for yourself, it’s going to be a much bigger issue if this becomes a trend.

Bounce rates are something all conscientious webmasters are going to monitor, even the ones who aren’t much of a master at all. We’re like any good Canadian web hosting provider in that we make these analytics resources available to people through our control panel, and most other providers will do the same, making it easier to see if there’s some deficiency in your website that’s causing visitors to leave far too soon.

We found an interesting resource that puts the importance of website load times in real perspective for people who might not get the magnitude of it otherwise, and we thought we should share that here today.

2 or 3 at Most

The general consensus is that a website should load in less than 2 to 3 seconds, and more simply, your website should load as fast as possible. Anything more than this time frame and you risk losing potential customers. A study done by Google in 2017 indicated that as page load time goes from 1 second to 3 seconds, the likelihood of that visitor ‘bouncing’ increases by 32%.

It’s very possible that these numbers will have increased by this point 4 years later. The reality is the longer your page takes to load, the more likely it is users will find that to be unacceptable.

There’s no getting around the fact that a fast user experience is extremely important, even more so with the increasing predominance of mobile browsing. An analysis of 900,000 landing pages across several countries and industry sectors found that the majority of mobile sites are too slow. Apparently the average time needed for a mobile landing page to load is 22 seconds, and when you keep that 2-to-3-second guideline in mind, this suggests there’s probably a whole lot of bouncing going on.

The same study found that on average it took 10.3 seconds to fully load a webpage on desktop and 27.3 seconds on mobile. We can pair this with stats like the one that indicates mobile minutes account for 79% of online time in the United States, and it’s well known that desktop conversion rates are higher.
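
If you want a rough baseline for your own pages, a quick server-side timing check is easy to script. The sketch below, using Python’s requests library, measures time to first byte and total download time only; it ignores browser rendering, JavaScript and subresources, so real user-perceived load times will be longer than what it reports:

```python
import time
import requests  # third-party: pip install requests

def measure_load(url):
    """Rough server-side timing: time to first byte and total download time.
    Browser rendering, JavaScript and subresources are not included, so real
    user-perceived load time will be longer than what this reports."""
    start = time.perf_counter()
    response = requests.get(url, stream=True, timeout=30)
    ttfb = time.perf_counter() - start   # response headers received
    _ = response.content                 # force the full body download
    total = time.perf_counter() - start
    print(f"{url}: TTFB {ttfb:.2f}s, full download {total:.2f}s")

if __name__ == "__main__":
    measure_load("https://example.com/")
```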

Plus:

  • The average time it takes for a desktop webpage to load is 10.3 seconds
  • The average web page takes 87.84% longer to load on mobile than desktop
  • 53% of mobile site visitors will leave a page if it takes longer than three seconds to load
  • Bounce rates are highest for mobile shoppers, in comparison to tablet and desktop users

Load Time & Relation to Customer Behavior

It’s commonly understood that page speed affects user experience, but exactly how detrimental that can be may still be beyond some people’s understanding. The load time of a website impacts consumer behaviour directly: a page that takes longer to load will mean a higher bounce rate. Those running an online business should understand that a slow load time can result in a higher bounce rate, a general loss of traffic, and a significant loss in conversions.

Keep this in mind as well: the average attention span for a Generation Z person is just 8 seconds, and the average Millennial attention span is only slightly better. Most of these people won’t be inclined to wait for a page to load; more often than not they will search for a new page.

Here’s more in a quick reference list on what may happen when your load time is too slow:

  • Approximately 70% of people say that the speed of a page affects their willingness to buy from an online retailer
  • Even a 0.1-second change in page load time can impact the user journey and continued progression through a site
  • The number 1 reason in the U.S. why consumers choose to bounce from a mobile web page is excessive latency
  • When eCommerce sites load slower than expected, more than 45% of visitors will be less likely to proceed with purchasing
  • Even just a 2-second delay in page speed can more than double bounce rates

Load Times & Relation to SEO

You shouldn’t need to be told that good SEO and higher SERP rankings are of primary importance for any eCommerce website. It’s been more than a decade now since Google included page speed as a ranking factor for desktop, and it did the same for mobile-first indexing in 2018. Long story short, your website is going to be ranked based on its mobile presence, not desktop.

This means that if your site is overly slow on mobile, your SERPs may take a hit. And while Google deemed website load time a ranking signal, marketers actually aren’t entirely convinced. One study of 11.8 million Google search results indicated that there was no correlation between speed and first page Google rankings.

Google’s ‘Speed Update’ tells a different story, as the update only affects pages that deliver the slowest experience to users. However, it’s true that site speed also affects other ranking factors like bounce rates and total time on site. The estimate is that the average page loading speed for a first-page result on Google is 1.65 seconds. Long story short again: if you have an extremely slow webpage, you are not going to make it onto the first page of Google, and you may not even make it onto page 2 or 3.
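
One practical way to see where your page stands is Google’s PageSpeed Insights API. The sketch below queries the v5 endpoint for a mobile performance score; the response field names reflect our reading of the API and are worth verifying against the current documentation before relying on them:

```python
import requests  # third-party: pip install requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url, strategy="mobile"):
    """Fetch the Lighthouse performance score (0 to 1) for a page from the
    PageSpeed Insights v5 API. The response field names below reflect our
    reading of the API and should be checked against the current docs."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    print(performance_score("https://example.com/"))
```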

And considering that prospective customers aren’t inclined to dig very deep when finding what they’re looking for, that’s a genuine big deal. We’ll wrap up today by suggesting you consider this:

  • Users spend more time on websites that load faster and the number of pages those users view decreases as page speed slows down
  • Sites with an average page load time of 2 seconds had customers view an average of 8.9 pages, compared to those with a load time of 7 seconds where users viewed only 3.7 pages