Quantum Computing Goes Desktop

To say much has been made of the potential of quantum computing would be a big understatement. We’re really only just scratching the surface of what its reach can be, and it holds enormous promise for improving our lives along with major contributions to the efficiency of business. And nowadays quantum computing is coming along in leaps and bounds. Previously its demands meant the physical hardware was expansive and the farthest thing from portable, but that’s changed, and having quantum computing go desktop is a big development.

This has the potential to reach into every industry and interest, and here at 4GoodHosting we are like any quality Canadian web hosting provider in that it stands to reshape our landscape and those of the people who choose us to host their websites. Computing industry experts are calling an operating system available on a chip a ‘sensational breakthrough’.

About 50 quantum computers have been built up to this point, and all of them use different software, as a quantum equivalent of Windows, iOS or Linux doesn’t exist. This new development means an OS that enables the same quantum software to run on different quantum computing hardware types.

Let It Flow

The system has been named Deltaflow.OS and runs on a chip developed by consortium member SEEQC, using almost none of the space required by previous hardware. The chip is about the size of a coin, yet it has all the power and capacities of previous versions that were much larger. Its relevance for the future of quantum computers is huge, especially as it looks like these chips can be produced cost-effectively and at scale.

A little bit of explanation may be required here – quantum computers store information in the form of quantum bits, or ‘qubits’ as they are called. A qubit can exist in two different information states at the same time. Being truly powerful requires scaling up to many more qubits in order to make solving seriously challenging problems possible. Racks full of electronics were required to control qubits previously, but now it’s all able to flow from a chip.
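
For readers who like to see the idea in code, here is a minimal sketch in ordinary Python – a toy probability model, not a real quantum simulator – of how a qubit in an equal superposition collapses to 0 or 1 on measurement:

```python
import random

# Toy model: a qubit state is a pair of complex amplitudes (a, b)
# for the states |0> and |1>, with |a|^2 + |b|^2 = 1.
def measure(a: complex, b: complex) -> int:
    """Collapse the qubit: returns 0 with probability |a|^2, otherwise 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition: both outcomes equally likely until measured
a = b = complex(1 / 2 ** 0.5)

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a, b)] += 1
# counts ends up roughly balanced, near {0: 5000, 1: 5000}
```

Real quantum hardware gets its power from many qubits interacting, which classical code like this cannot reproduce efficiently – and that is exactly why scaling up qubit counts matters.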

Grand Vision

The long-term goal is to have an operating system that makes quantum software portable across qubit technologies – scalable to millions of qubits. Part of that will be teasing the highest possible performance out of every qubit, and that will apply to applications like error correction that require fast feedback loops too.

The next question then is what will quantum computing be used for, and what are some specific benefit areas? A sufficient supply of qubits will allow quantum computers to process complex calculations at very high speeds, and so there is very real application for chemical testing without the use of a physical lab, as just one example.

What this entails is taking that vast processing power and using it to simulate digital versions of chemical compounds, test theories and predict chemical reactions without needing a physical lab and staff going through the processes of the tests. What this could do for the pharmaceutical industry is huge, especially when you consider it takes about $1 billion to bring a major big-ticket new drug to market after many years of research, tests, and clinical trials. Quantum computing could speed this up and reduce research and development costs in a big way.

Better Batteries

If humans around the world are to achieve their carbon-neutral aims then the large-scale switch to electric vehicles is going to mean the need for better batteries, and a lot of them. In much the same way the speed and reach of quantum computing can aid in drug development, the same virtual lab environment created by these computers may enable a much faster, less expensive, and more robust way to screen battery materials, leading to improved research and development towards a cleaner future.

We can expect to see quantum developments in logistics, weather prediction, cybersecurity, and finance too. The technology will evolve in step with firmware developments for quantum processors that will later interface with Deltaflow.OS. There’s also something of a contest to see who will be first to transform quantum computers from experimental technology into commercial products. This is being referred to as the ‘quantum advantage’ and that’s a term you may be hearing a lot more of over the next little while.

Multi-Cloud Strategies to Dominate Future

Cloud computing is now nearly ubiquitous in digital operations for businesses, and what it’s done to replace the need for physical storage among a whole host of other benefits has obviously been a great development. It’s not often that a technological advance of this kind works so well while still in its infancy, but in fairness the cloud was a natural outgrowth of many other pivotal gains in computing technology coming together and bringing in other key components.

Evolution is always a part of these developments too, and when a new way of ‘doing things’ is so eagerly adopted by so many people those evolutions tend to come fairly fast and furious. Here at 4GoodHosting we’re like every good Canadian web hosting provider in that we’re a little more front row than others for watching these types of major advances as they take shape, and some of us also have the web and computer savvy to have a better handle on what it all means and how it promises to add positive changes to the digital work world.

Being agile is an absolute necessity in the IT world, especially considering the roadmap is always changing. Not to say that full 180s are often required, but sometimes there is a need to plant your foot in the grass and pivot hard. Revisiting cloud infrastructure is always going to be a part of this, and more often than not it’s caused by workloads increasing massively in size overnight. ‘Overnight’ is used loosely there; while it’s not exactly overnight, the message is that big change requirements can come around surprisingly quickly and you need to be able to pivot and rechart without inflexibility.

At the forefront of all of this is a trend where multi-cloud strategies are seen as ideal fits for businesses, and that’s what we’ll look at here today.

Clearer Appeal

What is being seen now is multi-cloud strategies emerging as a dominant part of many organizations’ long-term IT roadmaps. A recent study conducted by an IT services firm in the US and UK came back with some interesting findings regarding what’s in the near future for multi-cloud. The report paints a clear picture of the industry and makes clear how companies’ investments in cloud services differ from what they were even a short while ago.

What was interesting to note first was that nearly half of respondents indicated they had chosen a new provider in the last year, which shows that shares of the cloud market are very much up for grabs between the major providers. What needs to be a primary focus for organizations right now is investing in the correct cloud strategy for their unique IT workloads.

Standard interests like pricing or vendor incentives, security, and technical requirements are the top drivers in decision making related to cloud infrastructure, and it’s among these metrics that companies weigh what is going to serve them best. Around 56% of respondents also indicated that security considerations will impact final decisions around choosing a provider in a big way.

42% Considering Multi-Cloud Strategy

So as we see that organizations are indeed moving toward multi-cloud strategies, it’s important not to overlook how private cloud environments hold onto their importance for organizations trying to make the best decisions about which workloads belong in which cloud.

Here are key findings from the survey:

  • Microsoft Azure leads the way as the most popular public cloud provider (58%), and next are Google Cloud (41%), IBM (40%) and AWS (38%)
  • Nearly half (46%) of respondents have gone with a new provider or platform within the last year, and more than 25% of them have made that move sometime in the past 6 months
  • Just 1% of respondents indicated having the same cloud provider or platform since their initial foray into cloud computing
  • 42% of respondents are pursuing a multi-cloud strategy
  • The vast majority of private cloud users (80% of them) stated better information security was their primary reason for going with a private cloud environment.
  • 89% of healthcare organizations and 81% of public sector interests foresee a continuing need for their private cloud over the next 5 years and beyond

Following all the major disruptions seen over the last year and some, the need for a modern technology stack is pushing cloud adoption hard in every industry. Capitalizing on the opportunities available in the market right now means cloud providers must meet these organizations’ complex security and legacy workload needs.

Password Hygiene? It’s a Thing

You might not know it, but the word hygiene has Greek mythology roots. Hygieia was a daughter of Asclepius, who you probably also didn’t know was the Greek god of medicine. Hygieia was the goddess of health, cleanliness and sanitation, so that pretty much makes sense in as far as where the word comes from. We all know how important it is to brush our teeth every day, but apparently it’s also possible to be healthy, clean, and entirely sanitized with those digital gatekeepers we call passwords.

We’ve all seen password strength meters that give you an idea of how suitably strong your password is, but maybe far too many people are going with 54321 or something of the sort. Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we’ve come across all sorts of stories of password failures over the years and we try to make a point of giving customers some insights into good practices for the digital world if they need them.

And apparently the need is there. Passwords are still the primary form of authentication, but done poorly they can leave you wide open to attacks. Passwords get stolen, and it’s happening a lot more often nowadays. They’re obtained by all sorts of underhanded means, and some of yours may not be exclusively in your possession anymore either.

Billions Out There

At present there are billions of passwords available on the Dark Web, collected via various attack methods ranging from malware to phishing for them. Many are then used in password spraying and credential stuffing attacks.

The primary reason this is able to happen, according to web security experts, is that around 65% of users reuse some of their passwords. That’s highly inadvisable, and if you do it then you put yourself at risk of stolen or compromised credentials. There’s another estimate that 1 in 5 companies that suffered a malicious data breach had it happen because of stolen or compromised credentials.

So what is poor password hygiene? It’s really any type of choice or omission in setting or sharing passwords that leaves doors wide open for attackers. Even if you’re in the IT department with plenty on your plate, a lack of knowledge about good password practices may be putting you at risk.

Put Effort into It

Choosing weak, easily guessable passwords like common keyboard patterns or passwords that are obviously connected to an organization name, location or other common identifiers is where a lot of people mess up. Another common move is changing passwords only by adding sequential characters at the end. An example would be changing password1 to password2.
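
A simple screening script can catch many of these habits automatically. Here is a Python sketch of the idea; the patterns and thresholds are illustrative only, as real screeners use much larger rule sets:

```python
import re

# A few illustrative weak patterns; real tools check far more of them
KEYBOARD_RUNS = ("qwerty", "asdf", "12345", "zxcv")

def looks_weak(pw: str) -> bool:
    """Flag passwords built from keyboard runs or word-plus-digits suffixes."""
    p = pw.lower()
    if any(run in p for run in KEYBOARD_RUNS):
        return True
    # a common base word with a short numeric suffix, e.g. "password1"
    if re.fullmatch(r"[a-z]+\d{1,3}", p) and len(p) < 12:
        return True
    return False

print(looks_weak("password1"))  # True: word plus sequential digit
print(looks_weak("qwerty99"))   # True: keyboard run
```

Running a check like this when passwords are set or changed stops the weakest choices before they ever reach production.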

A great example of this is what happened to the Marriott hotel chain. Just last year attackers obtained the login credentials of two Marriott employees and then compromised a reservation system, ultimately exposing payment information, names, mailing addresses, and much more for hundreds of millions of customers.

Why It Continues

Poor password hygiene continues to be a problem because it’s not visible enough as a problem or a potential threat. And thinking that attackers are only interested in targeting large organizations is incorrect too. Attackers do target SMBs, and do it more often as online applications and remote technologies that can be compromised fairly easily become increasingly common.

Overrating the security of two-factor authentication is another common misconception. Two-factor authentication is a good security measure, but it’s certainly not fail-safe. You still need your password to be as fully secure as possible.

And with Active Directory (AD), there is the belief that the password policy in AD is going to be sufficient. But it does not eliminate the use of compromised passwords or do anything to flag weak password construction patterns. You also shouldn’t assume that implementing and enforcing a robust password security policy will inevitably create user friction.

Simplifying Password Security

Here are some fairly solid recommendations:

  • Choosing a minimum length of 8 characters while encouraging the use of longer passwords
  • Removing forced password expiration and arbitrary complexity requirements
  • Screening new passwords against a list of passwords known to be leaked or compromised
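
That last recommendation is easy to prototype. The sketch below checks a candidate against a small stand-in denylist by comparing hashes; a production version would query a real breach corpus (services like HaveIBeenPwned expose an API for this) rather than this hypothetical hard-coded set:

```python
import hashlib

# Hypothetical stand-in for a real breached-password corpus
LEAKED = {"123456", "password", "qwerty", "password1"}
LEAKED_HASHES = {hashlib.sha1(p.encode()).hexdigest() for p in LEAKED}

def is_compromised(candidate: str) -> bool:
    """Return True if the candidate password appears in the leaked set."""
    return hashlib.sha1(candidate.encode()).hexdigest() in LEAKED_HASHES

print(is_compromised("password1"))               # True
print(is_compromised("maple-lantern-orbit-42"))  # False
```

Comparing hashes rather than plaintext is what lets services run this check against third-party breach data without ever transmitting the user’s actual password.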

You also need to take risk level into account. Removing expiration guidelines can leave a security gap given how long it takes organizations to identify a breach, so it’s a good idea to go with technical solutions that can reduce the poor password hygiene issues this can create.

Other good practices are:

  • Eliminating the use of common password construction patterns
  • Supporting user-oriented features such as passphrases (more memorable, longer passwords) and length-based aging, where lengthy, strong passwords need to expire less frequently
  • Continuously blocking the use of leaked passwords
  • Letting users reset their passwords with MFA (multi-factor authentication) from anywhere
  • Working with existing settings you already use, such as Group Policy
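
Passphrases in particular are easy to generate. Here is a sketch using Python’s `secrets` module; the ten-word list is purely illustrative, since real generators draw from curated lists of a few thousand words:

```python
import secrets

# Hypothetical mini word list; real generators use thousands of words
WORDS = ["maple", "orbit", "velvet", "canyon", "pioneer",
         "lantern", "harbor", "meadow", "copper", "drift"]

def passphrase(n_words: int = 4) -> str:
    """Join randomly chosen words into a memorable, lengthy passphrase."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())  # e.g. "harbor-drift-maple-canyon"
```

With a realistic list of around 7,000 words, four random words give roughly 51 bits of entropy, which is why length tends to beat forced complexity.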

This is something that you want to be proactive about, and it’s really not asking too much of people to come up with a more solid and secure password. Go through what can happen if you have a weak password and you’ll know why that’s a headache you really want to avoid.

New Invention for More Climate-Friendly Data Servers

There’s no debating the fact that a higher population means higher power consumption. In the same way, greater demands on data servers caused by so many more people using them indirectly are unavoidable too, and the amount of power data centers already use is definitely not good for the environment. Denmark isn’t anywhere close to being one of the most populated countries in Europe, but even there a single large data center can consume the equivalent of four percent of Denmark’s total electricity consumption.

That’s according to the Danish Council on Climate Change, and when you consider what that means you can imagine what the number would be like for much more heavily populated countries around the world. The growth of IoT and SaaS applications is going to increase this consumption and in the big picture it’s really not good. Here at 4GoodHosting, the nature of what we do as a quality Canadian web hosting provider means we can 100% relate to anything connected to operating large-scale data centers.

Plus, we prefer good news to any news that is not as good, and that’s why this particular piece of news out of Denmark made the cut for our blog entry this week. Let’s get into it, and it might make us all feel a little more at ease about our own small but significant contributions to power usage.

A+ Algorithm

What’s behind all of this is a new algorithm developed by Danish researchers that’s able to promote major reductions with the world’s computer servers and their resource consumption. We need to keep in mind that computer servers are every bit as taxing on the climate as all global airline traffic combined, and that is why green transitions in IT are so important.

So why exactly is this such an issue?

The world’s increasing internet usage has a very real and profound impact on climate due to the immense amount of electricity that’s demanded by computer servers. Current CO2 emissions from data centers are very high, and unfortunately they are expected to double within just a few years. Studies have indicated global data centers consume more than 400 terawatt-hours of electricity each year, accounting for approximately 2% of the world’s total greenhouse gas emissions.

The person who gets the very real credit for developing this algorithm is Professor Mikkel Thorup and his team of researchers. They previously found a way to streamline computer server workflows and it resulted in considerable saved energy and resources. It was promptly incorporated by Google and Vimeo, among other tech giants.

Vimeo in particular stated that using this algorithm had cut their bandwidth usage by a factor of eight.

What the team has done now is build on that algorithm, and in a big way. It is now capable of addressing a fundamental problem in computer systems: the way some servers become overloaded while others have capacity remaining.

Overload Stopper

The consensus is that this version of the algorithm is many times better and reduces resource usage as much as possible. And what’s also hugely beneficial is it being made available freely to whoever would like to make use of it. With worldwide internet traffic continuing to soar, the new algorithm developed by Thorup and his team addresses the problem of servers becoming overloaded with more client requests than they are able to handle.

Look no further than how many people are streaming content through Netflix or something similar every night. When this happens, systems commonly require shifting clients around so that servers have a balanced distribution. It’s challenging to do, as often up to a billion servers are part of one individual system.

What usually results is congestion and server breakdowns, and with all the congestion and overload of requests comes a LOT of power and other resource consumption. Internet traffic is projected to triple between 2017 and 2022, and as it continues to increase the problem will continue to grow, but the algorithm provides a scalable solution.

Minimal Movement

To this point these types of fixes have always involved a lot of different steps, but this new algorithm isn’t like that and that’s another reason why it’s being heralded the way it is. It ensures that clients are distributed as evenly as possible among servers by moving them around as little as possible, and by making content retrieval as local as possible.

Previously, keeping client distribution balanced so that no server was more than 10% more burdened than the others could mean dealing with an update by moving a client 100 times or more. Now that can be done in about 10 moves or less, even when the number of clients and servers in the system is a billion or more.
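
The underlying technique is broadly in the family of consistent hashing with bounded loads. The Python sketch below is a simplified illustration of that idea rather than the researchers’ actual algorithm: each client walks the hash ring from its own position and lands on the first server whose load is still under a cap tied to the average load:

```python
import hashlib
import math

def _h(key: str) -> int:
    """Deterministic hash giving each key a position on the ring."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

def assign(clients, servers, epsilon=0.1):
    """Place each client on the first server clockwise from its hash
    whose load is below ceil((1 + epsilon) * average load)."""
    cap = math.ceil((1 + epsilon) * len(clients) / len(servers))
    ring = sorted(servers, key=_h)
    loads = {s: 0 for s in servers}
    placement = {}
    for c in clients:
        # index of the first server at or past the client's ring position
        start = next((i for i, s in enumerate(ring) if _h(s) >= _h(c)), 0)
        for step in range(len(ring)):
            s = ring[(start + step) % len(ring)]
            if loads[s] < cap:
                loads[s] += 1
                placement[c] = s
                break
    return placement, loads

placement, loads = assign([f"client{i}" for i in range(100)],
                          [f"server{i}" for i in range(10)])
# every client gets placed, and no server exceeds the load cap
```

Because the cap tracks the average load, adding or removing a server forces only a small number of clients to relocate, which is the property behind the ‘about 10 moves or less’ claim.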

Stats Indicating Website Load Time Importance

In last week’s entry here we touched on ‘bounce’ rates, and how having higher ones can be hugely problematic for any website that’s serving an e-commerce function. If you’ve got a basic WordPress site for your blog on a basic shared web hosting package then you won’t be particularly concerned if those who enter choose to exit shortly thereafter. If your site is the avenue by which you sell products and services and make a livelihood for yourself, it’s going to be a much bigger issue if this becomes a trend.

Bounce rates are something that all conscientious webmasters are going to monitor, even the ones who aren’t much of a master at all. We’re like any good Canadian web hosting provider in that we make these analytics resources available to people through our control panel, and most other providers will do the same so that it’s made easier to see if there’s some deficiency to your website that’s causing visitors to leave far too soon.

We found an interesting resource that puts the importance of website load times in real perspective for people who might not get the magnitude of it otherwise, and we thought we should share that here today.

2 or 3 at Most

The general consensus is that a website should load in less than 2 to 3 seconds, and more simply your website should load as fast as possible. Anything more than this time frame and you risk losing potential customers. There was a study done by Google in 2017 that indicated that as page load time goes from 1 second to 3, the likelihood of that visitor ‘bouncing’ increases by 32%.

It’s very possible that these numbers will have increased by this point 4 years later. The reality is the longer your page takes to load, the more likely it is users will find that to be unacceptable.

There’s no getting around the fact a fast user experience is extremely important, and even more so with the increasing predominance of mobile browsing. An analysis of 900,000 landing pages across several countries and industry sectors found that the majority of mobile sites are too slow. Apparently the average time needed for a mobile landing page to load is 22 seconds, and when you keep that 2-to-3-seconds-at-most guideline in mind this shows that there’s probably a whole lot of bouncing going on.

The same study found that on average it took 10.3 seconds to fully load a webpage on desktop and 27.3 seconds on mobile. We can pair this with stats like the one that indicates mobile minutes account for 79% of online time in the United States, and it’s well known that desktop conversion rates are higher.

Plus:

  • The average time it takes for a desktop webpage to load is 10.3 seconds
  • The average web page takes 87.84% longer to load on mobile than desktop
  • 53% of mobile site visitors will leave a page if it takes longer than three seconds to load
  • Bounce rates are highest for mobile shoppers, in comparison to tablet and desktop users
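
If you want a rough first look at your own numbers, a basic fetch timer is easy to sketch in Python. Note it times only the raw HTTP response, not the full browser render that tools like Lighthouse measure, so treat the result as a lower bound:

```python
import time
import urllib.request

def fetch_seconds(url: str) -> float:
    """Time one HTTP fetch: latency plus transfer, not full page render."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

# e.g. fetch_seconds("https://example.com") returns the elapsed seconds
```

If even this bare fetch takes multiple seconds, the fully rendered page will certainly be slower, and that is a clear signal to investigate.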

Load Time & Relation to Customer Behavior

It’s commonly understood that page speed affects user experience, but exactly how detrimental that can be may still be beyond some people’s understanding. The load time of a website impacts consumer behaviour directly. A page that takes longer to load will mean a higher bounce rate. Those running an online business should understand that a slow load time can result in a higher bounce rate, a general loss of traffic, and a significant loss in conversions.

Keep this in mind as well: the average attention span for a Generation Z person is just 8 seconds, and the average Millennial attention span is only slightly better than that. Most of these people won’t be inclined to wait for a page to load, and more often than not they will search for a new page.

Here’s more in a quick reference list on what may happen when your load time is too slow:

  • Approximately 70% of people say that the speed of a page affects their willingness to buy from an online retailer
  • Even a 0.1-second change in page load time can impact the user journey and continued progression through a site
  • The number 1 reason in the U.S. why consumers choose to bounce from a mobile web page is excessive latency
  • When eCommerce sites load slower than expected, more than 45% of visitors will be less likely to proceed with purchasing
  • Even just a 2-second delay in page speed can push bounce rates up by more than double

Load Times & Relation to SEO

You shouldn’t need to be told that good SEO and higher SERPs are of primary importance all the time for any eCommerce website. It’s been more than a decade now since Google included page speed as a ranking factor for desktop, and it did the same for mobile-first indexing in 2018. Long story short, your website is going to be ranked based on mobile presence, not desktop.

This means that if your site is overly slow on mobile, your SERPs may take a hit. And while Google deemed website load time a ranking signal, marketers actually aren’t entirely convinced. One study of 11.8 million Google search results indicated that there was no correlation between speed and first page Google rankings.

Google’s ‘Speed Update’ tells a different story, as the update only affects pages that deliver the slowest experience to users. However, it’s true that site speed also affects other ranking factors like bounce rates and total time on-site. The estimate is that the average page loading speed for a first-page result on Google is 1.65 seconds. Long story short again, if you have an extremely slow webpage you are not going to make it onto the first page of Google. You may not even make it onto page 2 or 3.

And considering that prospective customers aren’t inclined to dig very deep when finding what they’re looking for, that’s a genuine big deal. We’ll wrap up today by suggesting you consider this:


  • Users spend more time on websites that load faster and the number of pages those users view decreases as page speed slows down
  • Sites with an average page load time of 2 seconds had customers view an average of 8.9 pages, compared to those with a load time of 7 seconds where users viewed only 3.7 pages

Better Website Navigation for E-Commerce Websites

Anyone who has a website serving as the primary point of contact between their goods and / or services and paying customers is probably going to want that site functioning as optimally as possible. As it relates to sales and incoming revenue, that’s going to be even more important if it’s an e-commerce website and you are as profit oriented as the next guy or gal. Most people have a lot invested in their business, and money is only a part of that investment. People will want to get the maximum return on that investment, and good website navigation is definitely a factor.

We can relate to all of this here at 4GoodHosting, as being a quality Canadian web hosting provider we’re equally interested in returns on investments and we tend to have something of an affinity for anything digital. The fact you’re reading this means you’ve visited our website at least once, and we’ve put the same priority on solid website navigation that many others have to ensure we have as many new web hosting in Canada customers signing up with us as possible.

Yes, the lowest prices on reliable web hosting in Canada do the lion’s share of the work there for bringing new customers into the fold, but the design of the site is a factor just like it is for any e-commerce website where you want to be retaining customers and making sure as few as possible become ‘bounce rate’ statistics.

Orders Up, Bouncers Down

So we’ll refer to those people who leave a website shortly after arriving as ‘bouncers’ then, with absolutely no relation to the huge man who’ll throw you out of the club if you get out of line in there. These people usually bounce because the website is a) visually unappealing to them to the point it suggests a lack of professionalism for the business, or b) the way they’re able to move through the site’s pages isn’t to their liking.

Fortunately, the ‘like’ part of the equation has little to do with personal preferences or anything else of the sort. It has more to do with their inherent belief as to the way an e-commerce site should work when it comes to entering, looking over products or services, and then proceeding to buy or order them. Also good news is that for the vast majority of people their preferences and expectations are fairly similar as far as site navigation is concerned.

After all, if there wasn’t such widespread agreement on this we wouldn’t be able to share these tips. Here they are:

Go with Slim Menus

Everything important about your website has a direct connection to the site’s navigation. Choosing to try to fit it all into a single area can have very negative effects. A general guideline that’s good to stick by is that you shouldn’t have more than seven menu items in your navigation scheme. A lean menu is more conducive to focusing and moving quickly, while a menu loaded with options can put people off without them even being entirely aware of it.

The navigation that works best with menus is one that shows your main services or products and is descriptive enough but is still concise overall.

Descriptive Menu Items are Best

Google and other search engines crawl your site if it’s on the web, and when they do your descriptive menu items will be indexed. Those using only a general or generic term will find their site lumped into that mix. For that reason it’s better to create terms that are more specific to your product or message. They will index more effectively and drive more of the right type of traffic to your website. The simple fact is that terms which are too general are going to apply equally to many, many types of businesses. Making your navigation terms descriptive will reduce bad clicks and bounce rates.

Be Wary of Dropdown Menus

Nesting all of your categories in a dropdown menu may mean your visitor doesn’t mouse over it and bounces shortly thereafter because they didn’t see what they wanted quickly enough. A simple navigation menu with descriptive terms will direct your user to a page where you can present more sub-items. You can then design these pages to engage and convert the visitor rather than losing them early on because they weren’t inclined to put their eyes through their paces looking all up and down a menu.

Order is Important

A basic rule of navigation holds that the first and last items in a menu are the most effective. Whether a visitor’s attention is captured, and whether they’re retained long enough to reach checkout, depends most on the items at the beginning and end of your menu. Your most important (and most frequently ordered) items should go at the beginning and end of the menu, and the least frequently ordered items can take up the middle.

Search Bar Location Significance

When a menu fails to engage and your dropdowns are overlooked, it’s natural that a user will head for the search bar. Sometimes it’s the last chance you have before someone becomes a bouncer, so it’s very important to have your search bar readily visible on the home page of the website.

Search can be a highly valuable feature, especially on an e-commerce site. E-commerce experts will tell you your search bar should be near the top of every page on your website, not just the home page.

Content and Social Media

It’s true that a blog and social media links can be very beneficial overall for conversion rates. Engaging your audience works to build lasting relationships and eventually becomes a key to continued business and traffic growth. Visitors and shoppers who find your social media the perfect mix of appealing and engaging will be even more pleased to find it’s paired with quality web content when they visit your site. In such a scenario they are more likely to become loyal to you, be return customers and – perhaps most importantly – refer others to you.

Links to these areas need to be seen as an integral part of the site’s navigation, and overall it needs to be sharp to create a very inviting environment to go along with all the effort you’re likely already putting into SEO to drive inbound traffic. Check your analytics before and then again after making navigation adjustments to your site. Small changes can mean big differences.

Do’s and Don’ts for Hosted Exchange Migrations

Trends are trends, and the reason there’s often no stopping trends is because there’s a darn good reason everyone’s doing whatever it is. These days one such trend that’s got solid legitimacy behind it is moving from an on-premises Microsoft Exchange deployment to hosted Exchange, and for most people it is nothing short of a huge undertaking. It’s often full of major issues along with considerations and decisions galore, and many people won’t know what they’ve gotten into with moving to hosted Exchange until they’re well into the process.

But you’re going to do what you’re going to do, and especially if it’s something you feel you need to do. I remember when I was very young and my grandfather said to me ‘some birds do, and some birds don’t. Some birds will, and some birds won’t.’ I had absolutely no idea what on earth he was talking about but I stared up into the sky anyways. The few birds I saw were flying around being birds like any other and I remember thinking what is it they would or wouldn’t be doing in the first place.

But enough about that. Our discussion today is not necessarily about trends or about who is going to do what. It’s about getting your organization into Exchange Online, and for some people it’s full of pitfalls that can make the whole thing far too unpleasant, especially if you have no choice but to continue on with it.

So here’s what we know about what you should do, and what you shouldn’t do.

Don’t underestimate the time required for moving the entirety of data over

A whole bunch of factors can make this a lengthy ordeal. How many users do you have? How much data does each mailbox have stored? Do you have bandwidth constraints? The list can go on. Migrating email to the cloud can take anywhere from a few days to several weeks. In fact, Microsoft can contribute one major slowdown of their own – a less-obvious protective feature of Exchange Online throttles sustained inbound connections to prevent the risk of the system being overwhelmed. A noble aim, but it may have you getting frustrated pretty quickly if you’re hoping to keep your migration moving ahead.

However, once you’re up and running and fully in the cloud you’ll come to appreciate this defense line, which works to benefit the general subscription base. But when you are trying to ingest data you may have it slowing to a crawl. That’s just the way it is, and there may not be a way around so you’ll have to be patient.
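To set expectations before you start, a back-of-envelope estimate can be made from total mailbox data and your effective upload throughput. Every number here is a placeholder to substitute with your own, and the 50% efficiency default is just an illustrative guess at the combined effect of protocol overhead and Exchange Online throttling:

```python
def estimated_migration_hours(total_gb: float,
                              link_mbps: float,
                              efficiency: float = 0.5) -> float:
    """Rough migration time estimate.

    total_gb   -- total mailbox data to move
    link_mbps  -- raw upload bandwidth in megabits per second
    efficiency -- fraction of the link you realistically sustain once
                  overhead and throttling are factored in (0.5 is only
                  an illustrative guess, not a Microsoft figure)
    """
    effective_mbps = link_mbps * efficiency
    total_megabits = total_gb * 8 * 1000  # 1 GB ~ 8000 Mb in decimal units
    seconds = total_megabits / effective_mbps
    return round(seconds / 3600, 1)
```

For example, 500 GB over a 100 Mbps link at that efficiency works out to roughly 22 hours of raw transfer time, before any retries or throttling pauses are added on top.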

Do use a delta-pass migration

A delta-pass migration rather than a strict cutover migration reduces time pressure on you later in the migration. With a delta-pass migration, multiple migration passes are made while mail is still being delivered on-premises. For example, the first pass might move everything from Mar 1 backward, and then another pass later in the week moves the ‘delta’ (the changes) from Mar 1 through Mar 4, and so on in succession until mailboxes are up to date.

This is a useful technique, with each successive migration batch being smaller than the last and taking less time. Your users won’t lose historical mailbox data because their migrated mailbox already holds it.
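The loop described above can be sketched in a few lines. This is an illustrative model of the technique, not a real migration API:

```python
def delta_pass_migrate(source: dict, target: dict) -> int:
    """One migration 'pass': copy items from source to target, skipping
    anything already moved and unchanged. Run repeatedly while mail keeps
    arriving on-premises. Items are {id: (version, payload)} pairs.
    Returns how many items were copied this pass."""
    copied = 0
    for item_id, (version, payload) in source.items():
        # Copy only items that are new or have changed since the last pass.
        if target.get(item_id, (None, None))[0] != version:
            target[item_id] = (version, payload)
            copied += 1
    return copied
```

The first call copies everything; subsequent calls copy only new or changed items, so each batch shrinks until the final cutover delta is trivial.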

Don’t skip configuring edge devices and intrusion detection systems to recognize & trust Exchange Online

Forgetting, or choosing not to, may mean your migrations are interrupted because your IDS thinks a DoS attack is underway. The fix is that Microsoft makes available a regularly updated list of IP addresses used by all Microsoft 365 services, and you can use it to configure your edge devices to trust those traffic flows.
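Microsoft publishes that list as a JSON feed (via the endpoints.office.com web service), and pulling the Exchange Online ranges out of it can be scripted. The sample record below only mirrors the general shape of that feed; fetch and verify the real, current list yourself before configuring anything:

```python
import json

# Hard-coded sample mirroring the shape of Microsoft's published
# Microsoft 365 endpoints feed; the real list comes from the
# endpoints.office.com web service and is updated regularly.
SAMPLE_FEED = json.dumps([
    {"serviceArea": "Exchange", "category": "Optimize",
     "ips": ["13.107.6.152/31", "13.107.18.10/31"], "required": True},
    {"serviceArea": "SharePoint", "category": "Default",
     "ips": ["13.107.136.0/22"], "required": True},
])

def exchange_ip_ranges(feed_json: str) -> list:
    """Extract the IP ranges your IDS and edge devices should trust
    for Exchange Online traffic."""
    ranges = []
    for entry in json.loads(feed_json):
        if entry.get("serviceArea") == "Exchange" and entry.get("required"):
            ranges.extend(entry.get("ips", []))
    return ranges
```

The resulting CIDR list is what you'd feed into your firewall or IDS allowlist so sustained migration traffic to those ranges isn't flagged.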

Do start by running the Microsoft 365 network health and connectivity tests

Microsoft offers a comprehensive tool capable of alerting you to routing or latency issues between you and the Microsoft 365 data centers. Speed, routing, latency, jitter, and more are all covered on your network connection to identify and isolate common issues that could lead to a lessened experience for Microsoft 365 users. This is particularly true for voice applications.

Do plan on implementing 2-factor authentication

A primary advantage of moving to Exchange Online and Microsoft 365 is being able to use all of the new security features available in the cloud. Top among them is the ability to turn on two-factor authentication. It will diminish your attack surface significantly as soon as you turn it on, and since Microsoft has rewired the directory and Exchange security model on its servers to make it work, all that’s required of you is flipping the switch and showing your users where to enter their mobile phone numbers.

An even better choice is to use the Microsoft Authenticator app to cut down on the security and social-engineering risks of SMS text messages. Of course, deploying Authenticator across thousands and thousands of phones can be difficult, especially with BYOD setups and remote-work environments where employees don’t have IT support on hand; SMS, by contrast, requires nothing from the end user and is handled entirely by IT. Weigh that tradeoff, but either way 2-factor authentication really is the better choice.
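For context, authenticator apps are built on the standard time-based one-time password scheme (TOTP, RFC 6238), which is why the codes work with no SMS involved; the server and the app simply compute the same code from a shared secret and the current time. A minimal sketch:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant).
    secret is the shared key; for_time is a Unix timestamp."""
    counter = for_time // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test secret, `totp(b"12345678901234567890", 59, digits=8)` yields `"94287082"`, matching the published test vector.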

In a hybrid environment, don’t remove your last Exchange server

Keeping at least one Exchange Server running on premises in order to manage users is a cardinal rule for Exchange users who’ve recently made their migration. It is possible to continue to use the Active Directory attribute editing functionality to manage recipients, but it’s not supported particularly well. At least not at this time.

It is preferable to use the Exchange admin console of your on-premises server to manage recipients in a hybrid environment, and without leaving an Exchange server running in your on-premises deployment you can’t do that. Microsoft has said a solution for this should eventually be made available, but even after all this time there’s been little progress toward solving the problem. It really is the only stain on Exchange Online at this time, and it doesn’t take away from the overall advantages much if at all.

Managed Open Source Increasingly Driving Business Growth

Sharing the wealth is a pretty good rule to go by if you’re able to share it, and there’ve been plenty of examples where if you don’t you end up with someone like Robin Hood who will share it for you. When it comes to the world of web development there’s never been any doubt about that, and that’s why source code is made available as open source as readily as it is. Its widespread adoption has been of immense benefit to anyone who ‘builds’ anything worth mentioning for design and functionality.

Here at 4GoodHosting we’re like any good Canadian web hosting provider in that there are some of us around here who speak Programmer, but there are others who don’t speak it at all and that’s alright. Some weeks our entries here may be a little more digestible for the less web-savvy of you, but this likely isn’t going to be one of them. If you’re a coder, or if you’re someone who can appreciate what web development is doing for the marketing and promotion capacities of your business, then this is something that will be of interest.

Adopting new business strategies or implementing new technology is a proven effective way to grow and compete more effectively. More and more regularly it’s open source technology being tabbed as businesses seek a competitive edge and more of the latest innovations. A survey published not long ago found that 85% of enterprises reported using open source in their organization, and in simple numbers adoption of the software has really taken off over the last year. Almost half of these same teams are looking to rely more on open source in response to everything that’s changed (and they’ve learned) over the course of the COVID pandemic.

The Right Fit Now

You will be challenged to find anything around us that is NOT powered by open source today, from mobile phones to household appliances and more. Being able to build on the existing foundation of technology, and not be hampered in making use of what you can to build your expansion on, is what open source is all about. Open source and permissive licenses give businesses real agility and the ability to move faster, experiment, and innovate to be as competitive as possible in their space.

Open source is transparent and open to inspection, and as a result businesses benefit from the capability to utilize and process their own data without being tied to a single vendor or a single product. Add to that the open development model and contributions from enterprises small and large, plus a few select ‘big players’ like Amazon, and open source stays consistently at the very cutting edge of innovation.

One huge plus is that bugs in the code can be identified, diagnosed, and resolved quickly. Many have said this alone makes open source software more secure than any proprietary software. However, it is true that open source can be more difficult to implement than proprietary software, as it’s usually not plug-and-play in the same way. To maintain it you will also need to keep on top of patches and updates.

Because open source software is built by and for the community, it does come with some challenges. The worldwide open source community doesn’t give direct support to individual businesses using the technology, though there are forums, online guides, and other resources where you can often find the information you need.

Add Management

And here is where managed open source enters the picture. It is an express solution to some of the key challenges associated with open source software, letting businesses get the best out of open source without also taking on responsibility for maintenance. Managed open source providers handle implementation, maintenance, and security. This frees up in-house developers to focus on important work that contributes to business growth rather than spending time ‘running things’ on either end.

Open Source and Cloud

It’s expected that the global public cloud infrastructure market will expand massively in 2021, with some expectations of around 35% growth and some $120 billion in sales. What’s driving cloud adoption is exactly what’s driving open source adoption – business agility along with the ability to innovate and experiment at a speedier pace.

In the bigger picture businesses need to find a mix of solutions that fit them and their individual use-cases. For many businesses, that mix will include some combination of open source software and cloud technology. Implementing these technologies with the right support can promote growth, agility, and innovation. Businesses are coming to see how open source can help them, and because this trend will continue, it would make sense to brush up on open source if you do speak the language.

Siloscape: Newest Super Malware Arriving on Scene

No one needs to hear how malware has become so much more sophisticated and far-reaching nowadays, as the topic’s been beaten to death and everyone knows that cyber security experts are hard pressed to keep pace. Well, here we go again with one of the more menacing ones to come out of the void in recent years. That’s Siloscape, named that way because this malware’s primary aim is to escape the container, and what better way than up and out.

To get technical, Siloscape is a heavily obfuscated malware built to open a backdoor into poorly configured Kubernetes clusters and then run malicious containers to go along with other sneaky and up-to-no-good activities. If an entire cluster is compromised the attacker gets served sensitive information like credentials, confidential files, or even entire databases hosted in the cluster. Experts are semi-jokingly comparing this to the novel coronavirus, as this malware is pretty novel in itself; there’s really been nothing like it before, and that’s why it’s generating fanfare.

Unlikely to be as calamitous in the big picture as this darn pandemic though, which is a good thing.

All of this stuff tends to be fascinating enough for those of us here, as it would be for any Canadian web hosting provider. It’s the nature of the business, and while we have a formative understanding of web security practices, there’s no one here who’d be able to pull up the drawbridge in a situation like this.

So let’s have a look at this Siloscape malware and lay out what you might need to know if you’re your own cyber security expert.

Cluster Buster

For anyone who might not know, the reason this is as serious as it is is because Kubernetes is one of the most popular open-source applications around, and for good reason. Containers have been wonderful, and that’s why it’s unfortunate Siloscape is engineered to do what it does. So many organizations moving into the cloud are using Kubernetes clusters as their development and testing environments, and software supply chain attacks have to be seen as a huge threat.

Compromising an entire cluster is much more of a big deal than just an individual container. Clusters can be running multiple cloud applications and attackers might be able to steal critical information like usernames and passwords, an organization’s confidential and internal files or even entire databases hosted somewhere in that cluster. Then there’s also the possibility of leveraging it as a ransomware attack by taking the organization’s files hostage.

What You Need to Know

Some people don’t like sulfides, even though the foods that contain them tend to be good for your health. Onions are among them, and the reason we’re talking about foods here at all is because Siloscape uses the Tor proxy and an .onion domain to anonymously connect to its command and control (C2) server. Knowledge is power when you’re defending against a foe, and so we’ll share more of what we know about Siloscape’s operation and what you might be able to be on the lookout for.

Siloscape malware is characterized by these behaviors and techniques:

  • Targets common cloud applications (usually web servers) for initial access, using known vulnerabilities (‘1-days’) and often ones that already have a working exploit
  • Uses Windows container escape techniques to break out of the container and gain code execution on the underlying node
  • Abuses the node’s credentials to spread through the cluster
  • Connects to its C2 server via the IRC protocol over the Tor network
  • Waits for further commands

It’s very likely that we’ll hear a lot more about this new malware in the coming weeks and months, and with all the recent news of major data hacks in the USA you have to hope that we don’t hear of it in one of those contexts.

A Fix?

Microsoft doesn’t recommend using Windows containers as a security feature, and recommends Hyper-V containers instead for anything that relies on containerization as a security boundary. Processes running in Windows Server containers should be assumed to have the same privileges as admin on the host – the Kubernetes node. If you are running applications that need to be secured in Windows Server containers, then Hyper-V containers are the safer choice.
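If you want to audit a cluster for this, one hedged starting point is flagging pods that schedule onto Windows nodes without a Hyper-V-isolated runtime class. The runtime class name below is a placeholder; match whatever runtime classes your cluster actually defines:

```python
def flag_unisolated_windows_pods(pods: list,
                                 hyperv_classes=("runhcs-wcow-hypervisor",)) -> list:
    """Return the names of pods that select Windows nodes but don't use a
    Hyper-V-isolated runtime class, i.e. pods relying on process-isolated
    Windows Server containers as a security boundary. Pods are plain dicts
    shaped like Kubernetes pod manifests."""
    flagged = []
    for pod in pods:
        spec = pod.get("spec", {})
        selector = spec.get("nodeSelector", {})
        if selector.get("kubernetes.io/os") != "windows":
            continue  # not a Windows pod, nothing to check
        if spec.get("runtimeClassName") not in hyperv_classes:
            flagged.append(pod.get("metadata", {}).get("name", "<unnamed>"))
    return flagged
```

Fed a list of pod manifests (e.g. from `kubectl get pods -A -o json`), this surfaces the pods worth reviewing for Hyper-V isolation.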

Managed Hosting – The Pros and Cons

Managing something usually has the context of getting greater productivity out of whatever it is. So if it’s always a plus, does that apply to managed web hosting the same way? The appeal is easy to see, and if you don’t know what managed hosting is, it’s where the web hosting provider manages the system for you. As you’d expect, that means additional cost for the client, but many businesses and ventures will come to see that as money well spent. This is particularly true if time and manpower aren’t resources you have in abundance and you need to focus more on content rather than the workings of the site.

Here at 4GoodHosting we’re like any other quality Canadian web hosting provider in that what we do makes us fairly knowledgeable on matters like this one. While we think managed web hosting is great for some people, we also feel that with a little bit of self-initiative, paired with the ease of use of the cPanel that comes standard with our web hosting packages, you can be fairly productive on your own. That likely won’t be true for those operating larger e-commerce websites, but for anyone who’s – for example – selling their pottery online or something similar, you’d be surprised how proactive you can be.

So that’s what we’re going to do with our entry this week: look at what’s good about managed web hosting, and also at what’s not so good about it.

Pros of Managed Hosting

Being successful and / or profitable with a website can take up a lot of time and effort, primarily for content updates, design tweaks, and digital marketing activity. When you add ongoing technical maintenance on top of that, it can be overwhelming for some, especially if you’re the type who’s a webmaster in title while not being particularly web savvy in the first place. Everyone needs to start somewhere though.

The advantage of managed hosting is how it lets you focus on the specifics of your business without additional worries of any sort related to site performance or security. E-commerce merchants will always need speedy loading times but often won’t have the time, ability, or inclination to ensure that on their own. Managed hosting is also called fully-serviced hosting, and for anyone who’s not inclined or able to do the mechanic’s work of website performance it may well be worth the additional monthly cost.

Either way, it is true that you need to factor in the value of your time. Many of us don’t have much of it that isn’t spoken for when it comes to our livelihoods, so if that’s you and much of that livelihood is facilitated by a website(s) then managed hosting may be the right fit for you.

Cons of Managed Hosting

Conversely, if you have solid IT skills and see the value in having more immediate and hands-on control over your website, then managed hosting may be something you don’t really need or want. Many people don’t want to relinquish control over the fundamentals, and there is some merit to that; while most web hosting providers run reputable operations, there is always the chance of a bad actor doing harm.

(Yes, we can assure you that would never happen here)

As you might guess, this is in fact the most common reason people go with manual hosting. Let’s also consider a scenario where you want to make an immediate change but there’s a delay in response from your hosting company and a window of opportunity is lost. Another instance might be wanting to use a particular CMS but not having the needed support for it. Others will not want to have the chance of being in a situation where there’s no choice but to outsource important tasks due to time constraints or anything similar that’s putting you in a pinch.

Keep in mind two other concerns related to managed hosting. The first is cost; for anyone who thinks changes to their initial setup are very unlikely, it doesn’t make sense to continue paying extra for site management through managed hosting. Plus, the greater control that comes with manual hosting allows you to switch to a new host with minimal hassle if terms or prices change.

The last thing we’ll mention today is one more advantage of good managed hosting: you’ll have more immediate defensive responses if your site is attacked or a server fails. If you have any interest in migrating your website to a better Canadian web hosting provider then we’d be very happy to hear from you.