Ways to Minimize Data Usage with Android OS Devices

There are a lot of people who are decidedly in one camp or the other when it comes to iOS or Android smartphones, but it’s probably fair to say that many people go with Android because they’re not especially savvy about the technology and they like the fact that Android phones are generally less expensive than iPhones. That may change some day, but as of now that’s the way it is, and given Apple’s predispositions when it comes to their devices it’s unlikely it will. So if the small green droid is your guy then just go with your inclination.

You’ll be hard pressed to find more than a few people who have less than a couple of gigs of data on their mobile plans these days, and many people will have 10+ GB as a minimum if they’re inclined to do more than check email and use the odd messenger app from time to time. But if you’re an Android user and you’re roaring through your monthly data allowance, then you might be interested in what we’re going to share here this week – ways to use less data on your Android device.

It’s something that will be of interest to some and of no interest to others, but here at 4GoodHosting we’re like every other good Canadian web hosting provider in that we know most people choosing us for web hosting in Canada are being carried by the increasing wave of digitalization that we seemingly all are these days. That means more connectivity, and a need for more data – or being smarter with the existing data you have, as the case may be.

So let’s get right into it and put you in the know about making your data allowance last longer, even if you paid way less for your Android OS device in the first place. After all, who is made of money these days?

Diagnose Data Usage

The first step is to have an understanding of your data usage each month, and where and how you’re utilizing your data. On older Android versions you can open up the Network & Internet section and tap the line labeled ‘Data usage’ before selecting ‘Mobile data usage’ on the screen that appears. On newer devices you follow the same path but go a little bit further into ‘App data usage’. If your phone is fully updated to Android 12, you’ll tap the gear-shaped icon next to your mobile carrier’s name instead.

From there, just have a look and if you’re going through much more data than you used to then you can get a definitive look at where most of it is going. Is any of that superfluous stuff that you don’t really need to be doing? This is where you might want to start making some priority changes if you’re not willing or able to add more data to your plan.

Do Away with Background ‘Trickles’

Unnecessary background app activity has always been a drain on data allowances. Social and news apps tend to be the worst in this regard, checking in at regular intervals to prep content delivery in case you choose to open them again. If you don’t, that is data wasted. So here’s what you should do – check these apps and look for data-saving options in their settings. One popular choice for certain apps like the Twitter app for Android is to uncheck the ‘sync data’ option, which you should be able to find quite easily.

And here’s a general rule that everybody should take note of: no matter what you do to change user preferences or anything else of the like, Facebook is an obscene data guzzler and you should really try to limit your time on it with mobile if you’re concerned about using too much data. Save the sifting through posts and the like for when you’re at home and on your Wi-Fi network.

Compress Your Mobile Web Experience

A quick, easy and guaranteed effective way to force browsers to be not so data hungry is to reorient Google’s Chrome browser for Android into its Lite Mode, where pages are routed through Google’s servers so that the pages are compressed before they are presented to you. Here’s how:

  • Go into Chrome’s settings and look for the line listed as ‘Lite Mode’
  • Tap to activate it, and leave it that way all the time if you’re so inclined

It really is that simple, and estimates are that steady and consistent use of Lite Mode can deliver data savings of up to 60%. The other benefit is that you end up browsing much faster too once you’ve compressed your mobile web experience.

Advance Downloads of Media

This one might seem very obvious, but mobile streaming will absolutely ruin your data budgeting if you engage in it too often. It is hugely advantageous to download content in advance, and there are plenty of multimedia apps that make that fairly easy. Those who have the user freedoms that come with YouTube Premium or YouTube Music Premium can be proactive here by going to the ‘Background and downloads’ section of the app to adjust the settings. Tapping the 3-line menu icon brings up the download button, so you can conduct your downloads while still at home on your Wi-Fi network and not use even an ounce of mobile data.

Put the Brakes on Play Store

Auto updates can dig into your data too, and the Google Play Store is one of the worst culprits here. Open up the app on your device and select Settings and then Network Preferences. Once you’re there you can choose to have auto updates limited to ‘over Wi-Fi only’, and you can choose the same for auto-play videos. Highly recommended, and it’s something that’s easily done. If staying on top of the latest updates for apps, games and the like is important to you, you can choose ‘ask me every time’ to approve each one rather than allowing ongoing auto updates here.

Go with Light Versions

Many services now offer scaled-down versions of apps and sites that you can use or visit without using so much data. Look for Google’s ‘Go’ branded apps here, including Google Go, Google Maps Go, Navigation for Google Maps Go, Gallery Go, Gmail Go, Google Assistant Go, and YouTube Go. All come ready-made and will allow a sufficient user experience while not going too hard and heavy on your data allowance.

$1.44 Billion Ready to Go a Long Way for Satellite Internet in Canada

If digital connectivity isn’t an integral part of your life these days then you haven’t been living on earth, or at least anywhere outside of North Korea. What’s kind of ironic is the fact that those folks are kept off the information superhighway entirely for the most part, while their cousins and next-door neighbours in South Korea have the best Internet on earth. The logistical challenges that come with a country as large as Canada make those types of internet networks and speeds comparatively impossible, but there’s recently been a major development at the federal level that promises to make quality internet available to more Canadians than ever before.

Here at 4GoodHosting we imagine we’re like most Canadian web hosting providers in that the recent news that the federal government is directing $1.44 billion to Telesat satellite internet is something we NEED to talk about. The very basis of the service we provide is dependent on the web functioning as it should, but it’s been a well-known fact for years that people in urban centres enjoy much better connectivity than those in more rural areas of the country. Considering that the web is for so much more than browsing or streaming and is increasingly key to participating in life and personal advancement, that’s not right.

Telesat is a Crown-owned Canadian satellite communication company, and what is in development is their Lightspeed low-earth-orbit (LEO) satellite constellation. The investment is in the development of the first and second generations of satellites, plus establishing domestic employment targets and – in the bigger picture – aiming to reduce costs for regional providers who rely on Telesat products to provide internet connectivity to customers.

This has the potential to be a MAJOR development for people in quieter parts of the country who have been disappointed in the inferior internet they’ve had for more than 20 years now. Previously it was a problem without a real practical solution, but satellite internet is poised to be that solution now.

298-Strong ‘Constellation’

Telesat’s Lightspeed constellation is going to be made up of 298 low-earth-orbit (LEO) satellites that will be equipped to deliver gigabit speeds with impressive 30 to 50-millisecond latency, and make that available anywhere in the world. The network will run on global priority Ka-band spectrum and boast a capacity of 15 terabits per second. That would have sounded totally unrealistic even 10 years ago, but here we are, and isn’t progress a wonderful thing?

In our country the Lightspeed constellation will finally deliver the connectivity the country’s most isolated areas have been looking for over a long, long time now. Internet and cellular services enabled through Telesat will begin in about 4 years and connect 40,000 households in underserved communities. This is right in line with the government’s previously stated goal of providing high-speed internet to all Canadians by the end of this decade.

Jobs Too

Telesat is also going to invest $3.6 billion in capital expenditure in Canada, and this project’s development and then long-term infrastructure maintenance should provide up to 700 jobs. There is also going to be a focus placed on women in STEM programs when it comes to filling the engineering-related positions.

Partner to Starlink?

SpaceX’s Starlink probably needs no introduction, as this game-changer has already been discussed in the news at great length. Starlink is already making its way into Canadian homes, although with limited availability at this point. Starlink launched its testing phase in Canada earlier in 2021, allowing eligible Canadian customers to register for a satellite internet subscription. If anyone’s tried it and would care to let us know how it’s been for them, we’d love to hear it.

One big difference between Starlink and Telesat’s Lightspeed will be that Telesat will be making their powerhouse available to regional internet service providers. That is quite different from Starlink, which will sell its service directly to consumers.

Telesat has also received funding from provincial governments individually: Ontario to the tune of $109 million, and Quebec with $200 million in funding plus a separate $200 million investment in the company by La Belle Province.

Could it be the appetite for genuinely high-speed and reliable internet is stronger in rural Quebec than elsewhere in the country? Who knows, but this is definitely a potentially major development in making it available to all Canadians, no matter where they live.

Avoiding VPNs with Tracker Apps

The appeal of VPNs won’t need much explanation, and as data storage and access needs continue to grow all the time we will see more and more organizations making the move to them. There have been plenty of times the masses have been told that some new wrinkle in their digital product is innocuous or harmless, but at least some of the time that’s just not the truth. Many VPNs employ tracker apps, and the reasoning given for them is that they are part of offering you a more personalized experience and the like.

That in itself is true in the bigger picture, but in the smaller one it may be that tracker apps are actually putting you at risk. Here at 4GoodHosting we’re a Canadian web hosting provider that takes the well-being of our customers to heart, and given that some of you may be the ones making IT decisions for your business, venture, or interest, this is a topic that’s worthy of discussion. The industry consensus very much seems to be that tolerating VPNs with tracker apps is something you shouldn’t be so indifferent to.

But of course, convincing is always required and so with that understood let’s get right to laying out why that’s the consensus, and why a less intrusive VPN may be best for you.

Important Protocols

Most people making these decisions will be at least somewhat familiar with the protocols and encryption methods used by VPNs. Usually that level of understanding doesn’t make it at all clear why VPNs with trackers are creating risks. But let’s start at the start and begin with what a tracker is and what a tracker does. It’s actually a fairly basic explanation – a tracker is something that tracks the choices you make when moving around the Internet. It’s true that most websites and apps use trackers in some way, and they’re inclined to follow you nearly everywhere you go.

The information gathered by the trackers about you is then used for targeted advertisements and the like. The trackers are built by software developers at the behest of businesses that want to create greater profits by increasing the chances that like-minded people are made aware of what they have to offer.

1st and 3rd Party Trackers

Understanding the difference between first- and third-party trackers is also helpful in having a better understanding of them, and why they are not completely harmless in the way some people might think they are. The distinction between them is important. The ‘cookies’ we’ve ALL heard of are examples of first-party trackers, used to remember things like your language, layout preferences, and even for saving your shopping cart.

It’s fair to say that cookies are by and large necessary for many websites to give you the type of visitor experience you’re expecting, and preventing cookies from being stored is fairly straightforward if you have concerns about their function.

Third-party trackers are entirely different. They’ve been built into websites and apps for the explicit purpose of making money from you. What they are after nearly all of the time is PII – personally identifiable information. Examples could be your IP address, what browser you are using, where you choose to click, how long you are on a certain web page, device specs and more. As you’d imagine, this is where most people start to think these trackers are overstepping their bounds.

Free to Profile

And that will also be because the information collected by 3rd-party trackers is used to create a profile of you, and from it come targeted ads that are delivered to gain business and revenue from you. And yes, Google is ALL about 3rd-party trackers, with a ‘more the merrier’ attitude related to having them in place.

A lot of mobile apps will also make use of 3rd-party trackers, and in some ways you need to be even more aware when it comes to using a VPN that implements trackers in its apps. VPN apps that utilize trackers are compromising your privacy to make money, and that’s really the long and short of it. The trackers are not required for the app to function properly, and they are actively leaking your information to Google, Facebook, or whoever else among big data companies.

The extent of the information being collected will vary from app to app. But the use of trackers regardless means information about you is being shared, and this isn’t being communicated to users whatsoever.

More Capable Than You Think

Plenty of these third-party trackers are sophisticated to the point that they have a wide net of data to pull from, and often your IP address isn’t even needed to create a targeted profile of you. These trackers can use the huge amount of information they have plus a unique ID for you to connect the dots and still trace everything back to you. In other words, even if something as easily traceable as an IP address isn’t being shared, there may still be the ability to connect the dots and track a person’s behavior online.
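
To make that ‘connect the dots’ point concrete, here is a toy Python illustration – not any vendor’s actual code, and the attribute names are made up for the example – of how a handful of mundane device traits can be hashed into a stable profile ID with no IP address involved:

```python
import hashlib

# Hypothetical device traits a tracker might observe on each visit.
device_traits = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080",
    "timezone": "America/Vancouver",
    "language": "en-CA",
    "installed_fonts_hash": "a1b2c3",
}

# Same traits in, same ID out - every visit from this device maps back
# to the same pseudonymous profile, no IP address required.
fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(device_traits.items())).encode()
).hexdigest()

print(fingerprint[:16])  # stable profile ID derived purely from device traits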

This is why ever greater numbers of decision makers are deciding that a VPN service that is making use of trackers should not be trusted.

We’ll conclude here today by saying that it is possible in some instances to get clarity on what a VPN’s trackers might be getting up to. A good example is the Exodus tool, which is very useful for Android-specific information. Plus Apple is putting into place brand-new guidelines for App Store apps, making it mandatory that every single app disclose the information it collects, the permissions needed, and also what trackers are being used (if any). These are definitely steps in the right direction if people in general are going to become more trusting here.

7 Means for a Cooler PC

It’s best to be cool, and we’re not talking about ‘being’ anything here other than not being too hot. Anyone who’s ever had no choice but to put a computer through its paces will know what it’s like to hear the cooling fan whirring furiously and failing to make much difference with a PC that’s running too hot, but there are things you can do to try to keep yours cool that have nothing to do with shades and a leather jacket.

CPUs aren’t the only components that get run a little ragged and start feeling the heat. There’s a whole lot of people who are harder on their GPUs than anything else, and if you’re an avid gamer you also probably don’t need to be told what it’s like to have a fan that sounds like it’s about to seize up and die if it’s made to go any harder.

Here at 4GoodHosting we’re like most good Canadian web hosting providers in that there’s but a few of us here who are real hardware experts. Most are more savvy on the soft side, but even if you’re not the type who could do open-heart surgery on a PC you can still make your PC less likely to overheat and perhaps shut down at the worst time possible.

That’s what we’ll share with our entry this week – 7 ways to keep your work or gaming station from overheating.

Clean out your PC

When we talk about cleaning here, we’re talking about it literally and not meaning getting rid of superfluous files and scripts and the like. Through those vents goes a lot of dust and dirt that ends up settling on the interior components of your PC. That can have negative effects depending on where the bulk of that crud builds up. Pet fur can be a part of it too.

So actually opening up the casing and using a fine brush and / or some isopropyl alcohol to clean up problem spots can be a good idea.

Make Sure Fans are Properly Installed

It doesn’t happen often, but sometimes cooling fans are facing the wrong way, and when this happens they are 100% doing more harm than good. A fan’s orientation will have the intake side taking in cool air and the exhaust side of the blades dispersing hot air from the unit.

Have a look at yours if you’re having a problem with overheating. If the blades that are facing you curve away from you, then they’re the way they should be. If the blades facing you are curving towards you then somebody messed up assembling or re-assembling the computer and you’ve got a very simple but promising repair job to do.

You should also confirm the configuration works well for airflow. Try to aim for a slightly positive pressure setup, and it’s something you can find out more about online with a simple Google search.

Repaste the CPU Cooler

Having old thermal paste around your CPU cooler is pretty common, especially if yours is an older model and you’ve never had any reason or inclination to open it up and do basic maintenance. But if your computer is overheating now then maybe it’s time to give this a try, and it’s also not particularly difficult.

Redoing thermal paste can improve a CPU’s temperatures and repasting can also fix wonky applications for a brand-new build too. All you need to do is buy a tube of Arctic Silver 5 or something similar. Carefully scrape away the remnants of the existing paste and apply a new coating of it. It’s possible to also repaste on a GPU but it’s more challenging than doing it on a CPU.

Add Additional Fans

If one fan is not getting it done for you, you can opt to add more fans to your CPU or GPU to cool it down more effectively. A good choice is to start with additional case fans, which tend to be the cheapest, and usually not too much is required to work them into an existing case. Many people choose to have two fans at the front.

Upgrade the CPU Cooler

Generally speaking, beefier 3rd-party model fans are going to perform better than the stock ones that came with your PC. Dropping your processor’s temperature is made easy by upgrading to a third-party CPU cooler much of the time.

Another option is to go with a closed-loop cooler, but it’s only really necessary when RAM clearance is an issue for you or aesthetics are a consideration.

Go with a Mesh Front Case

Mesh-front cases are really popular right now, and switching yours out and going with one of these is also fairly basic. They look different and they work much better for ventilating against heat buildup. A cooling fan upgrade and a mesh front case may make a major difference in your ability to stop your desktop from overheating.

The last suggestion we’ll make is a minor one so it won’t get its own subhead. Most computers are perfectly fine with going dormant or to sleep when they’re not in use rather than being shut down. But shutting down at least somewhat regularly is better for general prevention of CPU and GPU heating. It’s a good idea to let yours go off completely every once in a while.

Quantum Computing Goes Desktop

To say much has been made of the potential of quantum computing would be a big understatement. We’re really only just scratching the surface of what its reach can be, and it holds so much promise for improving our lives along with major contributions to the efficiency of business. And nowadays quantum computing is coming on in leaps and bounds. Previously its capacity meant that the physical hardware was expansive and the farthest thing from portable, but now that’s changed, and having quantum computing go desktop is a big development.

This has the potential to reach into every industry and interest, and here at 4GoodHosting we are like any quality Canadian web hosting provider in that it has the potential to revise our landscape and those of the people who choose us to host their websites. Computing industry experts are calling an operating system available on a chip a ‘sensational breakthrough’.

About 50 quantum computers have been built up to this point, and all of them use different software, as a quantum equivalent of Windows, iOS or Linux doesn’t exist. But this new development delivers an OS enabling the same quantum software to run on different quantum computing hardware types.

Let It Flow

The system has been named Deltaflow.OS and runs on a chip developed by consortium member SEEQC using almost none of the space required by previous hardware. The chip is about the size of a coin but has all the same power and capacities as previous versions that were much larger, and its relevance for the future of quantum computers is huge, especially as it looks like these chips can be produced cost-effectively and at scale.

A little bit of explanation may be required here – quantum computers store information in the form of quantum bits, or ‘qubits’ as they are called. Qubits can exist in a pair of different information states at the same time. Being truly powerful requires scaling up to include many more qubits in order to make solving seriously challenging problems possible. Racks full of electronics were required to control qubits previously, but now it’s all able to flow from a chip.
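
For the curious, here is a tiny Python sketch (purely illustrative, using NumPy) of what a qubit’s two-states-at-once property looks like mathematically, and why simulating many qubits on classical machines gets out of hand so quickly:

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes; the squared
# magnitudes give the probabilities of measuring 0 or 1.
qubit = np.array([1, 1]) / np.sqrt(2)  # equal superposition of |0> and |1>
print(np.abs(qubit) ** 2)              # [0.5 0.5] - both outcomes equally likely

# The catch for classical simulation: n qubits require 2**n amplitudes,
# which is why scaling real quantum hardware (and the control systems
# behind it) matters so much.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes to track")
```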

Grand Vision

The long-term goal is to have an operating system that makes quantum software portable across qubit technologies – scalable to millions of qubits. Part of that will be teasing the highest possible performance out of every qubit, and that will apply to applications like error correction that require fast feedback loops too.

The next question then is what will quantum computing be used for, and what are some specific benefit areas? A sufficient supply of qubits will allow quantum computers to process complex calculations at very high speeds, and so there is a very real application for chemical testing without the use of a physical lab, as just one example.

What this entails is taking that vast processing power and using it to simulate digital versions of chemical compounds, test theories and predict chemical reactions without needing a physical lab and staff going through the processes of the tests. What this could do for the pharmaceutical industry is huge, especially when you consider it takes about $1 billion to bring a major big-ticket new drug to market after many years of research, tests, and clinical trials. Quantum computing could speed this up and reduce research and development costs in a big way.

Better Batteries

If humans around the world are to achieve their carbon-neutral aims then the large-scale switch to electric vehicles is going to mean the need for better batteries, and a lot of them. In much the same way the speed and reach of quantum computing can aid in drug development, the same virtual lab environment created by these computers may enable a much faster, less expensive, and more robust way to screen battery materials, leading to improved research and development towards a cleaner future.

We can expect to see quantum developments in logistics, weather prediction, cybersecurity, and finance too. The technology will evolve in step with firmware developments for quantum processors that will later interface with Deltaflow.OS. There’s also something of a contest to see who will be first to transform quantum computers from experimental technology into commercial products. This is being referred to as the ‘quantum advantage’ and that’s a term you may be hearing a lot more of over the next little while.

Multi-Cloud Strategies to Dominate Future

Cloud computing is now nearly ubiquitous in its role in digital operations for all businesses, and what it’s done to replace the need for physical storage – along with a whole host of other benefits – has obviously been a great development. It’s not often that a technological advance of this kind works so well when still in its infancy, but in fairness the cloud is something that was a natural outgrowth of many other pivotal gains in computing technology coming together and bringing in other key components.

Evolution is always a part of these developments too, and when a new way of ‘doing things’ is so eagerly adopted by so many people those evolutions tend to come fairly fast and furious. Here at 4GoodHosting we’re like every good Canadian web hosting provider in that we’re a little more front row than others for watching these types of major advances as they take shape, and some of us also have the web and computer savvy to have a better handle on what it all means and how it promises to add positive changes to the digital work world.

Being agile is an absolute necessity in the IT world, especially considering the roadmap is always changing. Not to say that full 180s are often required, but sometimes there is a need to plant your foot in the grass and pivot hard. Revisiting cloud infrastructure is always going to be a part of this, and more often than not it’s caused by workloads increasing massively in size overnight. That’s a term used loosely – it’s not exactly overnight – but the message is that big change requirements can come around surprisingly quickly and you need to be able to pivot and rechart without inflexibility.

At the forefront of all of this is a trend where multi-cloud strategies are seen as ideal fits for businesses, and that’s what we’ll look at here today.

Clearer Appeal

What is being seen now is multi-cloud strategies emerging as a dominant part of many organizations’ long-term IT roadmaps. A recent study conducted by an IT services firm in the US and UK came back with some interesting findings regarding what’s in the near future for multi-cloud. The report paints a clear picture of the industry and makes clear how companies’ investments in cloud services are different than they were even a short time ago.

What was interesting to note first was that nearly half of respondents indicated they have chosen a new provider in the last year, and this shows that shares of the cloud market are very much up for grabs between the major providers. What needs to be a primary focus for organizations right now is investing in the correct cloud strategy for their unique IT workloads.

Standard interests like pricing or vendor incentives, security and technical requirements are the top drivers when it comes to decision making related to cloud infrastructure, and it’s among these metrics that the bulk of decisions are made around what is going to serve the company best. Around 56% of respondents also indicated that security considerations will be impacting final decisions around choosing a provider in a big way.

42% Considering Multi-Cloud Strategy

So as we see that organizations are indeed moving toward multi-cloud strategies, it’s important not to overlook how private cloud environments hold onto their importance for organizations trying to make the best decisions about which workloads belong in which cloud.

Here are key findings from the survey:

  • Microsoft Azure leads the way as the most popular public cloud provider (58%), followed by Google Cloud (41%), IBM (40%) and AWS (38%)
  • Nearly half (46%) of respondents have gone with a new provider or platform within the last year — and more than 25% of them made that move sometime in the past 6 months
  • Just 1% of respondents indicated having had the same cloud provider or platform since their initial foray into cloud computing
  • 42% of respondents are pursuing a multi-cloud strategy
  • The vast majority of private cloud users (80%) stated better information security was their primary reason for going with a private cloud environment
  • 89% of healthcare organizations and 81% of public sector interests foresee a continuing need for their private cloud over the next 5 years and beyond

Following all the major disruptions seen over the last year and some, the need for a modern technology stack is pushing cloud adoption hard in every industry. Capitalizing on the opportunities available in the market right now means cloud providers must meet these organizations’ complex security and legacy workload needs.

Password Hygiene? It’s a Thing

You might not know it, but the word hygiene has Greek mythology roots. Hygieia was a daughter of Asclepius, who you probably also didn’t know was the Greek god of medicine. Hygieia was the goddess of health, cleanliness and sanitation, and so that pretty much makes sense as far as where the word comes from. We all know how important it is to brush our teeth every day, but apparently it’s also possible to be healthy, clean, and entirely sanitized with those digital gatekeepers we call passwords.

We’ve all seen password strength indicators that give you an idea of how suitably strong your password is, but maybe far too many people are going with 54321 or something of the sort. Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we’ve come across all sorts of stories of password failures over the years, and we try to make a point of giving customers some insights into good practices for the digital world if they need them.

And apparently the need is there. Passwords are still the primary form of authentication, but done poorly they can leave you vulnerable to attacks if your cybersecurity is not up to scratch. Passwords get stolen, and it’s happening a lot more often nowadays. They’re obtained by all sorts of underhanded means, and you may have some of yours that aren’t exclusively in your possession anymore too.

Billions Out There

At present there are billions of passwords available on the Dark Web, collected via various attack methods ranging from malware to phishing for them. Many are then used in password spraying and credential stuffing attacks.

The primary reason this is able to happen, according to web security experts, is that around 65% of users reuse some of their passwords. That’s highly inadvisable, and if you do it then you put yourself at risk of stolen or compromised credentials. There’s another estimate that 1 in 5 companies who suffered a malicious data breach had it happen because of stolen or compromised credentials.

So what is poor password hygiene? It’s really any type of choice or omission in setting or sharing passwords that leaves doors wide open for attackers. If you’re the one acting as the IT department for your organization, a lack of knowledge about good password practices may be putting you at risk.

Put Effort into It

Choosing weak, easily guessable passwords like common keyboard patterns, or passwords that are obviously connected to an organization name, location or other common identifiers, is where a lot of people mess up. Another common move is changing passwords only by adding sequential characters at the end – changing password1 to password2, for example.

A great example of this is what happened to the Marriott hotel chain. Just last year attackers obtained the login credentials of two Marriott employees and then compromised a reservation system, ultimately exposing payment information, names, mailing addresses, and much more for hundreds of millions of customers.

Why It Continues

Poor password hygiene is continuing to be a problem because it’s not visible enough as a problem or a potential threat. And thinking that attackers are only interested in targeting large organizations is incorrect too. Attackers do target SMBs and do it more often with the increasing frequency of online applications and remote technologies that can be compromised fairly easily a lot of the time.

The security of two-factor authentication is also overrated, and relying on it fully is another common misconception. Two-factor authentication is a good security measure, but it’s certainly not fail-safe. You still need your password to be as fully secure as possible.

And with Active Directory (AD), there is the belief that their password policy in AD is going to be sufficient. But it does not eliminate the use of compromised passwords or have anything to indicate the use of weak password construction patterns. You also shouldn’t think that implementing and enforcing a robust password security policy is going to create any degree of user friction.

Simplifying Password Security

Here are some fairly solid recommendations:

  • Choosing a minimum password length of 8 characters, and encouraging the use of even longer passwords
  • Removing password expiration and complexity requirements
  • Screening new passwords against a list of passwords known to be leaked or compromised (a minimal sketch of this follows below)
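
That screening step is the easiest one to automate. Here is a minimal sketch in Python, assuming you maintain a local file of SHA-1 hashes of known-leaked passwords, one hex digest per line; the file name, minimum length, and rules are illustrative, not any particular product’s implementation:

```python
import hashlib

LEAKED_HASHES_FILE = "leaked_sha1.txt"  # assumed local list of leaked-password hashes
MIN_LENGTH = 8

def sha1_hex(password: str) -> str:
    # Hash the candidate the same way the leaked list is stored
    return hashlib.sha1(password.encode("utf-8")).hexdigest().lower()

def load_leaked_hashes(path: str) -> set:
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def check_password(password: str, leaked: set) -> list:
    # Return a list of problems; an empty list means the password passes
    problems = []
    if len(password) < MIN_LENGTH:
        problems.append(f"shorter than {MIN_LENGTH} characters")
    if sha1_hex(password) in leaked:
        problems.append("appears in the leaked-password list")
    return problems

leaked = load_leaked_hashes(LEAKED_HASHES_FILE)
for candidate in ("password1", "correct horse battery staple"):
    issues = check_password(candidate, leaked)
    print(candidate, "->", issues or "OK")
```

Hashing the candidate rather than storing plaintext lists keeps the screening file itself from becoming a liability.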

You also need to take risk level into account. Removing expiration guidelines can lead to a security gap given how long it takes organizations to identify a breach, so it’s a good idea to go with technical solutions that can reduce the poor password hygiene issues this can create.

Other good practices are:

  • Eliminating the use of common password construction patterns
  • Supporting user-oriented features such as passphrases (more memorable longer passwords) and length-based password aging, which also allows less frequent password expiration because of how lengthy and strong the passwords are
  • Continuously blocking the use of leaked passwords
  • Making users able to reset their passwords with MFA (multi-factor authentication) from anywhere
  • Working with existing settings you already use, such as Group Policy

This is something that you want to be proactive about, and it’s really not asking too much of people to come up with a more solid and secure password. Go through what can happen if you have a weak password and you’ll know why that’s a headache you really want to avoid.

New Invention for More Climate-Friendly Data Servers

There’s no debating the fact that a higher population means higher power consumption. In the same way, greater demands on data servers caused by so many more people using them indirectly are unavoidable too, and the way data centers are already using so much power is definitely not good for the environment. Denmark isn’t anywhere close to being one of the most populated countries in Europe, but even there a single large data center can consume the equivalent of four percent of Denmark’s total electricity consumption.

That’s according to the Danish Council on Climate Change, and when you consider what that means you can imagine what the number would be like for much more heavily populated countries around the world. The growth of IoT and SaaS applications is going to increase this consumption and in the big picture it’s really not good. Here at 4GoodHosting, the nature of what we do as a quality Canadian web hosting provider means we can 100% relate to anything connected to operating large-scale data centers.

Plus, we prefer good news to any news that is not as good, and that’s why this particular piece of news out of Denmark made the cut for our blog entry this week. Let’s get into it – it might make us all feel a little more at ease about our own small but significant contributions to power usage.

A+ Algorithm

What’s behind all of this is a new algorithm developed by Danish researchers that’s able to deliver major reductions in the resource consumption of the world’s computer servers. We need to keep in mind that computer servers are every bit as taxing on the climate as all global airline traffic combined, and that is why green transitions in IT are so important.

So why exactly is this such an issue?

The world’s increasing internet usage has a very real and profound impact on climate due to the immense amount of electricity that’s demanded by computer servers. Current CO2 emissions from data centers are very high, and unfortunately they are expected to double within just a few years. Studies have indicated global data centers consume more than 400 terawatt-hours of electricity each year, accounting for approximately 2% of the world’s total greenhouse gas emissions.

The person who gets the very real credit for developing this algorithm is Professor Mikkel Thorup and his team of researchers. They previously found a way to streamline computer server workflows and it resulted in considerable saved energy and resources. It was promptly incorporated by Google and Vimeo, among other tech giants.

Vimeo in particular stated that using this algorithm had cut back their bandwidth usage by a factor of eight.

So what this team has done is built on that algorithm, and the reason it is noteworthy is because they have built on it in a big way. It is now capable of addressing a fundamental problem in computer systems and specifically with the way some servers become overloaded while other ones have capacity remaining.

Overload Stopper

The consensus is that this version of the algorithm is many times better and reduces resource usage as much as possible. And what’s also hugely beneficial is it being made available freely to whoever would like to make use of it. With worldwide internet traffic continuing to soar, the new algorithm developed by Thorup and his team addresses the problem of servers becoming overloaded with more client requests than they are able to handle.

Look no further than how many people are streaming content through Netflix or something similar every night. When this happens, systems commonly require a shifting of clients to make it so that servers have balanced distribution. It’s challenging to do, as often up to a billion servers are part of one individual system.

What usually results is congestion and server breakdowns, and with all the congestion and overload of requests comes a LOT of power and other resource consumption. Internet traffic is projected to triple between 2017 and 2022, and as it continues to increase the problem will continue to grow, but the algorithm provides a perfectly scalable solution.

Minimal Movement

To this point these types of fixes have always involved a lot of different steps, but this new algorithm isn’t like that, and that’s another reason why it’s being heralded the way it is. It ensures that clients are distributed as evenly as possible among servers by moving them around as little as possible, and by making content retrieval as local as possible.

Keeping client distribution balanced so that no server is more than 10% more burdened than the others would previously have meant dealing with an update by moving a client 100 times or more. Now that can be done in about 10 moves or less, even when the number of clients and servers in the system is a billion or more.
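
The published technique this line of work builds on is known as consistent hashing with bounded loads. Here’s a minimal Python sketch of the idea – ours, not the researchers’ production code: each server is capped at roughly (1 + ε) times the average load, and a client that hashes to a full server simply walks clockwise around the ring to the next server with room.

```python
import hashlib
import math
from bisect import bisect_right
from collections import Counter

def ring_hash(key: str) -> int:
    # Stable hash onto a ring of 2**32 positions
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % (2**32)

def assign(clients, servers, epsilon=0.1):
    """Consistent hashing with bounded loads: no server may hold more
    than ceil((1 + epsilon) * average) clients."""
    ring = sorted((ring_hash(s), s) for s in servers)
    points = [p for p, _ in ring]
    cap = math.ceil((1 + epsilon) * len(clients) / len(servers))
    load = Counter()
    placement = {}
    for c in clients:
        i = bisect_right(points, ring_hash(c)) % len(ring)
        while load[ring[i][1]] >= cap:   # server full: walk the ring
            i = (i + 1) % len(ring)
        placement[c] = ring[i][1]
        load[ring[i][1]] += 1
    return placement

clients = [f"client-{n}" for n in range(1000)]
servers = [f"server-{n}" for n in range(10)]
placement = assign(clients, servers)
print(Counter(placement.values()))  # every server holds at most 110 clients
```

With ε = 0.1 that cap is what keeps every server within about 10% of the average, and because a client only moves when its own server fills up, adding or removing a server touches very few assignments.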

Stats Indicating Website Load Time Importance

In last week’s entry here we touched on ‘bounce’ rates, and how having higher ones can be hugely problematic for any website that’s serving an e-commerce function. If you’ve got a basic WordPress site for your blog on a basic shared web hosting package then you won’t be particularly concerned if those who enter choose to exit shortly thereafter. If your site is the avenue by which you sell products and services and make a livelihood for yourself, it’s going to be a much bigger issue if this becomes a trend.

Bounce rates are something that all conscientious webmasters are going to monitor, even the ones who aren’t much of a master at all. We’re like any good Canadian web hosting provider in that we make these analytics resources available to people through our control panel, and most other providers will do the same, making it easier to see if there’s some deficiency to your website that’s causing visitors to leave far too soon.

We found an interesting resource that puts the importance of website load times in real perspective for people who might not get the magnitude of it otherwise, and we thought we should share that here today.

2 or 3 at Most

The general consensus is that a website should load in less than 2 to 3 seconds – or more simply, your website should load as fast as possible. Anything more than this time frame and you risk losing potential customers. There was a study done by Google in 2017 that indicated that as page load time goes from 1 second to 3, the likelihood of that visitor ‘bouncing’ increases by 32%.

It’s very possible that these numbers will have increased by this point 4 years later. The reality is the longer your page takes to load, the more likely it is users will find that to be unacceptable.

There’s no getting around the fact a fast user experience is extremely important, and even more so with the increasing predominance of mobile browsing. An analysis of 900,000 landing pages across several countries and industry sectors found that the majority of mobile sites are too slow. Apparently the average time needed for a mobile landing page to load is 22 seconds, and when you keep in mind the 2-to-3-seconds-at-most guideline, this shows that there’s probably a whole lot of bouncing going on.

The same study found that on average it took 10.3 seconds to fully load a webpage on desktop and 27.3 seconds on mobile. We can pair this with stats like the one that indicates mobile minutes account for 79% of online time in the United States, and it’s well known that desktop conversion rates are higher.

Plus:

  • The average time it takes for a desktop webpage to load is 10.3 seconds
  • The average web page takes 87.84% longer to load on mobile than desktop
  • 53% of mobile site visitors will leave a page if it takes longer than three seconds to load
  • Bounce rates are highest for mobile shoppers, in comparison to tablet and desktop users
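
Before taking those averages at face value, it’s easy to get a rough number for your own pages. Here’s a minimal sketch in Python, assuming the requests library is installed and with a placeholder URL; note this measures network and HTML delivery only, not full browser rendering, so treat it as a lower bound on what a real visitor experiences:

```python
import time
import requests

URL = "https://www.example.com/"  # placeholder: swap in your own page

start = time.perf_counter()
response = requests.get(URL, timeout=10)
elapsed_headers = response.elapsed.total_seconds()  # until response headers arrived
elapsed_total = time.perf_counter() - start         # until the full HTML was downloaded

print(f"Status:            {response.status_code}")
print(f"Headers received:  {elapsed_headers:.2f} s")
print(f"Full HTML fetched: {elapsed_total:.2f} s")
```

If even this crude server-side timing lands past the 2-to-3-second mark, a real browser with images, scripts and rendering on top of it will be considerably slower.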

Load Time & Relation to Customer Behavior

It’s commonly understood that page speed affects user experience, but exactly how detrimental that can be may still be beyond some people’s understanding. The load time of a website impacts consumer behaviour directly. A page that takes longer to load will mean a higher bounce rate. Those running an online business should understand that a slow load time can result in a higher bounce rate, a general loss of traffic, and a significant loss in conversions.

Keep this in mind as well: the average attention span for a Generation Z person is just 8 seconds, and the average Millennial attention span is only slightly better than that. Most of these people won’t be inclined to wait for a page to load, and more often than not they will search for a new page.

Here’s more in a quick reference list on what may happen when your load time is too slow:

  • Approximately 70% of people say that the speed of a page affects their willingness to buy from an online retailer
  • Even a 0.1-second change in page load time can impact the user journey and continued progression through a site
  • The number 1 reason in the U.S. why consumers choose to bounce from a mobile web page is excessive latency
  • When eCommerce sites load slower than expected, more than 45% of visitors will be less likely to proceed with purchasing
  • Even just a 2-second delay in page speed can push bounce rates up by more than double

Load Times & Relation to SEO

You shouldn’t need to be told that good SEO and higher SERPs are of primary importance all the time for any eCommerce website. It’s been more than a decade now since Google included page speed as a ranking factor for desktop, and it did the same for mobile-first indexing three years ago in 2018. Long story short, your website is going to be ranked based on its mobile presence, not desktop.

This means that if your site is overly slow on mobile, your SERPs may take a hit. And while Google deemed website load time a ranking signal, marketers actually aren’t entirely convinced. One study of 11.8 million Google search results indicated that there was no correlation between speed and first page Google rankings.

Google’s ‘Speed Update’ tells a different story, as the update only affects pages that deliver the slowest experience to users. However, it’s true that site speed also affects other ranking factors like bounce rates and total time on-site. The estimate is that the average page loading speed for the first page result on Google is 1.65 seconds. Long story short again, if you have an extremely slow webpage you are not going to make it onto the first page of Google. You may not even make it onto page 2 or 3.

And considering that prospective customers aren’t inclined to dig very deep when finding what they’re looking for, that’s a genuine big deal. We’ll wrap up today by suggesting you consider this:

  • Users spend more time on websites that load faster and the number of pages those users view decreases as page speed slows down
  • Sites with an average page load time of 2 seconds had customers view an average of 8.9 pages, compared to those with a load time of 7 seconds where users viewed only 3.7 pages

Better Website Navigation for E-Commerce Websites

Anyone who has a website serving as the primary point of contact between their goods and / or services and paying customers is probably going to want that site functioning as optimally as possible. As it relates to sales and incoming revenue, that’s going to be even more important if it’s an e-commerce website and you are as profit oriented as the next guy or gal. Most people have a lot invested in their business, and $ is only a part of that investment. People will want to get the maximum return on that investment, and good website navigation is definitely a factor.

We can relate to all of this here at 4GoodHosting, as being a quality Canadian web hosting provider we’re equally interested in returns on investments and we tend to have something of an affinity for anything digital. The fact you’re reading this means you’ve visited our website at least once, and we’ve put the same priority on solid website navigation that many others have, to ensure we have as many new web hosting in Canada customers signing up with us as possible.

Yes, the lowest prices on reliable web hosting in Canada do the lion’s share of the work there for bringing new customers into the fold, but the design of the site is a factor just like it is for any e-commerce website where you want to be retaining customers and making sure as few as possible become ‘bounce rate’ statistics.

Orders Up, Bouncers Down

So we’ll refer to those people who leave a website shortly after arriving as ‘bouncers’ then, with absolutely no relation to the huge man who’ll throw you out of the club if you get out of line in there. These people usually bounce because the website is a) visually unappealing to them to the point it suggests a lack of professionalism for the business, or b) the way they’re able to move through the site’s pages isn’t what they like.

Fortunately, the ‘like’ part of the equation has very little to do with personal preferences or anything else of the sort. It has more to do with their inherent belief as to the way an e-commerce site should work when it comes to entering, looking over products or services, and then proceeding to buy or order them. Also good news is that for the vast majority of people their preferences and expectations are fairly similar as far as site navigation is concerned.

After all, if there wasn’t such widespread agreement on this we wouldn’t be able to share these tips. Here they are:

Go with Slim Menus

Everything that is important about your website has a direct connection to the site’s navigation. Choosing to try to fit it all into a single area can have very negative effects. A general guideline that’s good to stick by is that you shouldn’t have more than seven menu items in your navigation scheme. A lean menu is one that’s more conducive to being able to focus and move quickly, while a menu loaded with options can put people off without them even being entirely aware of it.

The navigation that works best with menus is one that shows your main services or products and is descriptive enough but is still concise overall.

Descriptive Menu Items are Best

Google and other search engines crawl your site if it’s on the web, and when they do, your descriptive menu items will be indexed. Those only using general or generic terms will find their site is lumped into that mix. For that reason it’s better to create terms that are more specific to your product or message. They will index more effectively and drive more of the right type of traffic to your website. The simple fact is product and service terms that are too general are going to apply equally to many, many types of businesses. Making your navigation terms descriptive will reduce bad clicks and bounce rates.

Be Wary of Dropdown Menus

Nesting all of your categories in a dropdown menu may mean your visitor doesn’t mouse over it and bounces shortly thereafter because they didn’t see what they wanted offered speedily enough. A simple navigation menu with descriptive terms will direct your user to a page where you can present more sub items. You can then design these pages to engage and convert the visitor rather than losing them early on because they weren’t inclined to put their eyes through their paces looking all up and down a menu.

Order is Important

A navigation basic follows the belief that the first and last items that appear are going to be the most effective. Items that appear at the beginning and at the end of your menu do the most to capture a visitor’s attention and retain them long enough to get to checkout. Your most important (and most frequently ordered) items need to be at the beginning and end of the menu, and the least frequently ordered items can take up the middle.

Search Bar Location Significance

When a menu fails to engage and your dropdowns are overlooked it’s going to be natural that a user will head for the search bar. Sometimes it’s the last chance you have before someone becomes a bouncer so it’s very important to have your search function bar readily visible on the home page of the website.

Search can be a highly valuable feature, especially on an e-commerce site. E-commerce experts will tell you your search bar should be near the top of every page on your website too, not just the home page.

Content and Social Media

It’s true that a blog and social media links can be very beneficial overall for conversion rates. Engaging your audience works to build lasting relationships and eventually become a key to continued business and traffic growth. Visitors and shoppers who find your social media the perfect mix of appealing and engaging will be even more pleased to find it’s paired with quality web content when they visit your site. In such a scenario they are more likely to become loyal to you, be return customers and – perhaps most importantly – refer others to you.

Links to these areas need to be seen as an integral part of the site’s navigation, and overall it needs to be sharp to create a very inviting environment to go along with all the effort you’re likely already putting into SEO to drive inbound traffic. Check your analytics before and then again after making navigation adjustments to your site. Small changes can mean big differences.