Benefits of Website Isolation for Shared Hosting

There have been plenty of entries here over the years where we’ve talked about the many benefits of VPS web hosting, and for those who truly need a private server it’s not even a consideration to go with a more affordable shared web hosting package. Websites that need elbow room to do what they do when pages load are often a big part of a business’s online presence, so while it would be nice to have a less expensive web hosting arrangement, it’s just not possible with anything less than VPS hosting.

With that understood, anyone with a website that doesn’t need to be accommodated to such an extent can and will be fine with shared web hosting most of the time, and there are smaller and less dynamic e-commerce websites that have been on shared hosting plans for years. There are more valid concerns around the security and integrity of websites hosted in shared environments though, and that’s always going to be the case. Website isolation can be a partial fix for that, and it’s what we’re going to go over with this week’s entry.

It’s a topic that is going to be noteworthy for any Canadian web hosting provider, and here at 4GoodHosting we put the same emphasis on providing valuable information for anyone who has a website for their business, venture, or personal interest and wants to be proactive in ensuring that everything stays uneventful with their site and that it’s always problem-free alongside the information superhighway.

Isolation = Better Security

Businesses, bloggers, and web developers often rely on shared hosting environments because of their affordability and less complicated general management. But security is nearly always a challenge in these environments. Website isolation can be part of a solution for reducing risks, and there are 5 primary advantages to putting website isolation in place.

Better Overall Security – Being more vulnerable to security breaches is natural with shared hosting when multiple websites share the same server resources. Implementing isolation techniques helps contain the impact of a security breach, making it more likely that a compromised website does not affect others on the same server. Maintaining the overall security and integrity of all websites hosted on the shared server is always going to be important for businesses and website owners alike.

Effective Resource Management – Shared hosting environments will have all websites sharing the server’s resources, with CPU, memory, and bandwidth split between them. Maintaining the stability of the entire server is dependent on each website operating independently. Proper resource allocation can prevent resource-intensive websites from causing performance issues or crashes that may well affect other websites on the server.

Safeguarding Privacy and Data – Users are only going to fully trust in the protection of their privacy if blocking unauthorized access to sensitive data like user information and database content can be relied on, based on the web host’s infrastructure and the measures it has in place. Techniques that restrict access to authorized users only play a key role in safeguarding the integrity of websites on shared servers, and this is particularly important for sites that handle sensitive customer data or payment information.

Regulation Compliance – Many industries are required to comply with regulations like the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA). Having isolation measures in place often goes a long way toward meeting these requirements, as it helps block unauthorized access to sensitive data.

Protecting Website Performance – Having consistent performance levels for all websites hosted on a shared server can be a challenge, but it’s a necessity if a host is going to legitimately offer it to web hosting customers. Techniques that ensure each website operates independently can help prevent resource-intensive websites from monopolizing server resources, leading to improved performance across all hosted websites.

Isolation techniques for shared hosting websites do much for maintaining security, privacy, and performance in shared hosting environments. They allow for better performance metrics with the way that each website and user account operates independently. This contributes to protecting the integrity of websites on shared servers.
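
The resource-allocation idea behind that isolation can be sketched in a few lines. This is only an illustration of the principle – real hosts enforce quotas at the operating-system level (with kernel control groups and similar mechanisms), and the site names and numbers here are made up:

```python
# Hypothetical sketch of per-account resource quotas on a shared server:
# each site gets a hard cap, so one busy site cannot starve its neighbours.

def allocate(requests, quota):
    """Cap each site's resource request at the per-site quota.

    requests: dict mapping site name -> CPU share the site is asking for
    quota:    the maximum share any single site may consume
    Returns a dict of granted shares.
    """
    return {site: min(asked, quota) for site, asked in requests.items()}

# siteA is a resource hog; isolation keeps its grant at the quota
requests = {"siteA": 0.9, "siteB": 0.1, "siteC": 0.25}
granted = allocate(requests, quota=0.25)
```

With the quota in place the noisy site is capped while the quieter sites receive exactly what they asked for, which is the stability guarantee described above.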

Anyone considering a shared web hosting provider will do well to look into any isolation features they offer, and it’s good to ask about what measures they have in place to guarantee the safety and integrity of your online presence.

Digging into Deep Learning

Anything in life that requires learning in order to have functional ability is going to also have a hands-on experiential component if you’re truly going to become adept at whatever it is. There is value in learning how to do it, but usually the most integral part of the learning comes from actually being at ground level and doing it. These days AI is the buzzword connected to nearly everything in the digital world, and that world is where an increasingly large part of everything we have and make use of has its roots.

We’ve talked about machine learning before, and it’s a concept that’s fairly well understood within the parameters of how advanced computing has made advances in leaps and bounds over recent years. As the saying goes, you certainly can’t stop progress, and the type of deep learning that people are talking about in relation to supercomputers and the like builds on machine learning and improves on it with a whole lot of experiential learning.

It is a profound subject, and more so for anyone or any business where much of what you do is being constantly reshaped by computing technology. That certainly applies to us here at 4GoodHosting in the same way it would for any reputable Canadian web hosting provider, and as AI continues its meteoric rise to prominence and increasing dominance in the space it makes sense for us to have a look at deep learning and how influentially it contributes to supercomputing.

Data Abundance

Deep learning is an advanced artificial intelligence technique whose usefulness comes from the way it leverages abundant data and increased computing power. Online language translation, automated face-tagging in social media, smart replies in your email, and the new wave of generative models are among the many reasons it’s the main technology behind many of the applications we use every day.

Everyone will have heard all the buzz around the ChatGPT AI-powered chatbot and the way it is revolutionizing pretty much everything, and part of its strength is in the way it can relate images to text descriptions, an example of how deep-learning systems that model the relation between images and text descriptions have massive potential for ‘smarter’ and more intuitive computing.

As mentioned, deep learning is a subset of machine learning, but one of the newer types; classic rule-based AI systems, and machine learning algorithms that use training to develop their behavior by processing annotated examples, are much less relevant now. That relevance has shifted to deep learning and neural networks.

Classic machine-learning algorithms solve many problems that rule-based programs can’t, but they don’t do well with soft data such as video, images, sound files, and unstructured text. The best example of this is with predictive modelling, where the contributions of domain experts, computer programmers, and mathematicians aren’t needed to anywhere near the same extent anymore. What takes the place of all that is an artificial neural network.

Human Brain Inspiration

Deep-learning algorithms tackle the same queries and challenges using deep neural networks. It is a type of software architecture that actually takes a lot of its framework from the human brain. Neural networks are layers upon layers of variables, and they adjust themselves based on what they can recognize in the properties of the data they are trained to work with. This makes them capable of doing tasks like classifying images and converting speech to text.

Neural networks are especially capable at going over unstructured data and finding common patterns in it, and in recent years the greater availability and affordability of storage, data, and computing resources have solidified neural networks as one of the pinnacles of AI innovation.
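
The “layers upon layers of variables” idea can be illustrated with a toy forward pass. The weights and inputs below are invented for illustration; in a real network they would be adjusted during training rather than fixed by hand:

```python
# A minimal two-layer neural network doing a forward pass in pure Python.
import math

def sigmoid(x):
    # squashes any value into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # each output neuron is a weighted sum of the inputs passed through
    # a non-linear activation function
    return [sigmoid(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# two inputs -> two hidden neurons -> one output neuron
hidden = layer([0.5, -1.2],
               weights=[[0.8, -0.4], [0.3, 0.9]],
               biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.5, -0.7]], biases=[-0.2])[0]
```

Training is the process of nudging those weight values so that outputs like this one match labeled examples, which is what lets stacked layers classify images or transcribe speech.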


Computer Vision

The science of having software make sense of the content of images and video is referred to as computer vision. Deep learning has made extensive progress here, and one of the best use-case examples is with cancer detection. This type of deep learning is also well established in many of the applications people use every day. Take Apple’s Face ID, for example – it uses computer vision to recognize your face, the same way Google Photos does for various features like searching for objects and scenes as well as for image correction.

Voice and Speech Recognition

Most people have taken advantage of Siri or Alexa at some point, and deep-learning algorithms are very much at work here too as they play a role in how your voice is converted into text commands. Google’s keyboard app is Gboard, and it uses deep learning to deliver on-device, real-time speech transcription that types as you speak.

Natural Language Processing and Generation

Natural language processing (NLP) is the science of extracting meaning from unstructured text, and developers of classic software have been looking for a workable fix to that challenge for decades. It’s not possible to define all the different nuances and hidden meanings of written language with computer rules, but neural networks trained on large bodies of text can execute many NLP tasks with accurate results.

Google Translate is another popular resource, and it also experienced a boost in performance when deep learning was incorporated into it. Smart speakers use deep-learning NLP to understand the various nuances contained in spoken commands, basic examples being requests for weather forecasts or directions.

Benefits of Domain Privacy for Business Websites

It’s fortunate that these days a greater number of people and businesses who are online are aware of the need to be smart about when and where you make phone numbers and email addresses available. Phishing scams are likely if you’re negligent about where your contact details are accessible, and in worst-case scenarios identity theft or even having your website hijacked can happen.

Anyone who knows anything about acting will be familiar with what a stand-in does for an actor while on set, and they’re very valuable that way. Domain privacy works in much the same way as the actor’s stand-in: it’s a service that replaces the individual’s contact information with that of the service provider. Doing this secures you against anyone who might be trying to access your details with the aim of initiating fraud or other malevolent practices.

Business owners investing in expensive websites will want to consider getting the domain privacy add-on when building their website or having it built for them. Keep in mind as well that contact information doesn’t need to be publicly displayed on your website to still be available. The aim needs to be making sure it doesn’t end up in WHOIS records, and this is a way to protect yourself from spambots that exclusively check WHOIS data.

This option is going to make a lot of sense for many individuals or businesses, and the growing need for expanded web security practices makes domain privacy a topic that any good Canadian web hosting provider is going to take an interest in, and that’s true for us here at 4GoodHosting too. So this is what we’ll look at with our blog entry this week.

Whose Name?

The way domain privacy functions starts with how domain names are registered. That’s something we are intimately familiar with, and most domain names in use on the Internet are registered through an organization called ICANN – the Internet Corporation for Assigned Names and Numbers. Domain owners must provide their name and contact information when purchasing a new domain name, and the information is then listed in a database called WHOIS.

Once it is catalogued in the directory it becomes accessible to anyone on the internet via the many free domain lookup tools. Anyone with an internet connection can find one easily and use it. Domain privacy makes it possible to hide your contact information from the WHOIS directory and make a random email address and the registrar’s own contact details available instead of your own.
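
For the curious, those lookup tools need nothing exotic under the hood: the WHOIS protocol is just a plain-text query over TCP port 43. The sketch below assumes the .com registry server and uses a simplified check for the privacy placeholder registrars commonly use; real WHOIS responses vary by registrar:

```python
# A sketch of a raw WHOIS lookup and a naive privacy check.
import socket

def whois_query(domain, server="whois.verisign-grs.com"):
    """Send a WHOIS query (domain + CRLF) and read the full response."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

def is_privacy_protected(whois_text):
    # registrars applying domain privacy typically replace registrant
    # fields with a placeholder like this one
    return "REDACTED FOR PRIVACY" in whois_text.upper()
```

Running `whois_query("example.com")` returns the same plain-text record the free web lookup tools display, which is exactly why spambots can harvest it so easily.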

We should note as well that there are legitimate reasons why ICANN makes it a requirement that every website owner put their contact details on display publicly. One of the bigger ones is the way it makes it easier for law enforcement agencies to track people in case there’s illegal activity on their websites. Helping lawyers hold website owners accountable in cases involving plagiarism and copyright infringement is important too.

It also makes it possible for people who are interested in buying your domain to contact you, although those aren’t always harmless communications either. The bad side of it can be people trying to send you marketing spam over email or phone. Hackers have used contact information to hijack websites, and publicly listed site addresses are targeted for phishing scams very regularly.

Look to EU’s GDPR

Hackers and spammers often have dedicated bots at work crawling through WHOIS directories in order to generate lists of contact details and then pick and choose the ones where they think they see the most promise with whatever it is they’re aiming to do. When contact details don’t show up on the WHOIS records it’s much more likely a business will evade the attention of such systems and steer themselves clear of any of these types of problems.

Here in North America we might want to look at what has happened with the European Union’s General Data Protection Regulation and how it relates to better protections for businesses that need to make contact information available. This set of data regulations aimed at protecting internet privacy is going far with this, creating legislation under which everyone protected by EU law will see their contact details redacted in WHOIS listings across the internet.

Seeing ‘redacted for privacy’ rather than the actual contact information is what people can expect. It may be wise for policy makers here in North America to do the same thing to offer people those same types of assurances when they’re doing business online.

Reddit Now Looking to get Paid for Role in A.I. Development

Gleaning insight from conversation with others is what has made Reddit the hit it has been in the digital space, and if you have any area of interest you can always find at least one sub-Reddit on it, and usually more than a few to look through. It’s one of the most engaging social networking sites if you like to ingest your information by consuming text, and the simplicity of it is really nice too. 57 million people visit Reddit every day to chat and broaden their horizons with whatever subject it is they’re interested in, and it’s a great resource for that.

Looking at it from a different angle, you might be surprised to learn that Reddit chats have also served as a free teaching aid for companies like Google, Microsoft and – most notably these days – OpenAI. Reddit conversations have been used in the development of giant artificial intelligence systems created by these three, and we’re seeing how those systems are already becoming such a big deal in the tech industry.

This is definitely a development that everyone in our industry will take note of given the connection, and here at 4GoodHosting we’re like any other quality Canadian web hosting provider in that seeing an online chat forum become an open-source type of asset for the big players in the digital world is something we’ll be keen to follow as well as share here on our blog.

Charging for Access

And it is definitely interesting to see that Reddit wants to be paid for accessing its application programming interface now. That’s the method through which outside entities can download and process the social network’s vast selection of person-to-person conversations, and the heads of the company at Reddit don’t think they should be making that value available for free.

So what we’ll see now is Reddit charging for access to the conversations it hosts for the purpose of developing A.I. systems like ChatGPT, OpenAI’s popular program. Reddit is also acting as it prepares for a possible initial public offering on Wall Street this year. As of now it makes most of its money through advertising and e-commerce transactions on its platform, but right now the discussion is ongoing around what they will be charging for A.P.I. access.
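
To see why those conversations are worth charging for, consider that Reddit’s listing endpoints return threads as nested JSON that flattens into training text with just a few lines of code. The sketch below mirrors the general shape of a Reddit listing payload, with made-up sample posts:

```python
# A sketch of how easily Reddit-style listing JSON flattens into a
# text corpus suitable for feeding a language model.

def extract_texts(listing):
    """Pull title + body text from a Reddit-style listing payload."""
    texts = []
    for child in listing["data"]["children"]:
        post = child["data"]
        texts.append(post.get("title", ""))
        if post.get("selftext"):          # link posts have an empty body
            texts.append(post["selftext"])
    return texts

# invented sample data shaped like a listing response
sample = {"kind": "Listing", "data": {"children": [
    {"kind": "t3", "data": {"title": "How do I start with Python?",
                            "selftext": "Total beginner here."}},
    {"kind": "t3", "data": {"title": "Best VPS for small sites?",
                            "selftext": ""}},
]}}
corpus = extract_texts(sample)
```

Multiply that by millions of threads and it is easy to see why Reddit views free A.P.I. access for A.I. training as value it has been giving away.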

The New Bots

Here are the other chatbots powered by artificial intelligence that have been utilizing Reddit as a development resource:

ChatGPT – the MUCH talked about artificial intelligence language model from the research lab OpenAI. It is able to respond to complex questions, write poetry, generate code, translate languages or even plan vacations. GPT-4, introduced in mid-March, is even able to respond to images.

Bing – Microsoft has a similar chatbot, capable of having open-ended text conversations around virtually any topic, but it apparently gave occasionally inaccurate, misleading and weird responses that put it in a bit of a negative light when it first came out.

Bard – This is Google’s chatbot, originally conceived as a creative tool designed to draft emails and poems. It is notable for the way it is able to generate ideas, answer questions with facts or opinions, or write blog posts entirely on its own.

Ernie – Here is the lesser light for sure, but Chinese search giant Baidu came out with its rival to ChatGPT in March. The name is short for Enhanced Representation through Knowledge Integration, but it hasn’t made anywhere near the splash the other three have.


L.L.M.s are sophisticated algorithms that companies like Google and OpenAI have developed, and the algorithms are able to take Reddit conversations as data, adding that data to the vast pool of material being fed into the L.L.M.s to develop them.

Other types of companies are seeing value in Reddit conversations, and also in what they have with images. Most people will know Shutterstock, an image hosting service. It sold image data to OpenAI to help create DALL-E, the A.I. program that creates vivid graphical imagery with only a text-based prompt required.

Artificial intelligence makers need two significant things to ensure their models continue to improve. The first is a staggeringly large amount of computing power, and the second is an enormous amount of data. Many of these A.I. developer major players have plenty of computing power but still need to go outside their own networks for the data needed to improve their algorithms.

Other sources utilized alongside Reddit are Wikipedia, millions of digitized books, and academic articles, and Reddit has had a long-standing symbiotic relationship with the search engines of companies like Google and Microsoft. Their search engines have been crawling Reddit’s web pages in order to index information and make it available for search results.

With LLMs though the dynamic is different, as they obtain as much data as they can to create new A.I. systems like the chatbots. Reddit’s A.P.I. is still going to be free to developers who want to build applications that help people use Reddit, and there’s also an aim to incorporate more so-called machine learning into how the site operates. One possible benefit of that is that it might identify the use of A.I.-generated text on Reddit and label it as such.

Leveraging Private Edge Computing and Networking Services for Better Scaled VPNs

It’s fairly common for data storage and management needs to have outgrown what you originally set up for their accommodation and for giving them the elbow room they need. There are all sorts of possibilities and variations on what that outgrowing might mean for a person or organization, depending on what they do and how extensive their data needs have become. For a long time now the default suggestion for anyone in such a situation would be to move to a Virtual Private Network (VPN).

But here we are again collectively struggling to keep up with changing needs and realities, and if we were to list out all the potential explanations as to why a VPN isn’t quite cutting it like it used to for people, we’d have an entire blog entry of its own. VPNs remain well entrenched as a critical enabling tool for today’s distributed organizations and internet users, though. Roughly one-third of all internet users now use a VPN to protect personal data, and that’s a number that’s going to get the attention of us here at 4GoodHosting in the same way it would for any good Canadian web hosting provider.

Then there’s the fact that there’s plenty ready to push this trend even further, especially with rampant cybercrime and privacy concerns likely to be front and center in the coming years. The pressure this puts on VPN providers is to offer reliable ways for this surging demand to be quickly, efficiently, and cost-effectively accommodated. And the need is even more acute in high-growth emerging markets which offer massive growth potential – Indonesia, China, Thailand, India, and the UAE to name the most notable ones.

The most recent and popular industry consensus is that the best way to do this is to leverage private edge computing and networking services as a means of scaling VPNs more ideally, and that’s what we’re going to look at with this week’s blog entry.

Difficult, but Doable

Let’s start with what makes this difficult. Heavy regulatory barriers, lacking infrastructure, gaps in connectivity, and expensive operating costs mean reaching customers in these markets can prove challenging. The entirety of scaling a VPN service is difficult too, and much of that is because until now there have only really been two approaches to doing it – horizontally or vertically.

When you scale up vertically it is almost always necessary to upgrade servers by replacing them. Expensive? Absolutely, and prohibitively so for a lot of the organizations that would need to eat those costs. But having optimal performance per server is a must, and so if you’re going to scale up vertically these high hardware replacement costs are pretty much unavoidable.

Scaling out horizontally presents its own set of reasons for decision makers to be dissuaded. Adding more servers to your current infrastructure to accommodate peak user loads is expensive and time consuming, and putting together a private, high-performing global network capable of spanning geographical distances can seem like a daunting task given how long it will take and how much it will likely cost. This is making no mention of the additional maintenance costs which add to the expense.

Private Edge Solution

Having infrastructure providers that offer global, private edge computing and networking services is what’s needed, but who has the means of stepping up and doing what’s necessary to make that available for those who need it? Another option exists for VPN providers that don’t find cost efficiencies in scaling horizontally or vertically.

That’s to work with a 3rd-party infrastructure enabler that has private, high-quality compute and networking services available at the edge of the network. The key part of this being at the edge is that it is relatively close to end users in strategic markets. That eliminates the distance problem from the equation, and by outsourcing network and compute operations these providers can instantly scale into global markets and serve new VPN customers.

Immediate benefits:

  • Improved performance with more ensured performance and stability in overseas markets
  • Reduction in long distance data transmissions resulting in faster data transfers and much less in the way of performance issues (latency / jitter)
  • Better security stemming from 3rd-party infrastructure providers being able to grant access to premium bare metal and virtual machines (VM) for enhanced VPN security and scaling more safely
  • Less maintenance due to avoiding the more constricted VPN setups where many servers are spread out across multiple locations
  • Lower operating costs as by outsourcing operations you are able to leverage flexible pricing models and pay less for the bandwidth you need

Last but not least, aggregate bandwidth pricing makes it more possible for you to evaluate the balance between underutilized and overutilized servers. You are then able to reduce bandwidth waste and make the most of your bandwidth spend.
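
Aggregate bandwidth billing is commonly done on a burstable, 95th-percentile basis: usage samples are sorted and the top 5% of bursts are ignored before billing. Here is a sketch of that evaluation with invented sample numbers – note how summing usage across servers before taking the percentile lets a quiet server offset a busy one:

```python
# A sketch of 95th-percentile (burstable) bandwidth billing, aggregated
# across servers. All traffic figures and the rate are illustrative only.

def percentile_95(samples):
    """Return the sample below which roughly 95% of measurements fall."""
    ordered = sorted(samples)
    idx = max(0, int(len(ordered) * 0.95) - 1)
    return ordered[idx]

def aggregate_bill(per_server_samples, rate_per_mbps):
    # summing per-interval usage across servers *before* taking the
    # percentile lets an underutilized server offset an overutilized one,
    # trimming billable bursts
    combined = [sum(interval) for interval in zip(*per_server_samples)]
    return percentile_95(combined) * rate_per_mbps

# two servers with opposite load patterns (Mbps per 5-minute interval)
bill = aggregate_bill([[10, 20, 30], [30, 20, 10]], rate_per_mbps=1)
```

Billed separately, each server’s own 95th-percentile burst would be charged; aggregated, the complementary peaks flatten out and the billable figure drops.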

Minimizing Loss and Latency with Cloud-Based Apps

Be it right or wrong, being accommodating and understanding of something or someone only occurs if basic expectations are still being met. Most of you who’d be reading this blog in the first place will know what a bounce rate is, and even if we don’t realize it we all have an inner clock that dictates how long we’ll be willing to wait for a page to load.

Page loads and page speeds are different, though, but all of this just highlights what’s already well known in the digital world. There’s only so much waiting a person can be expected to do, and this has led to efforts to minimize loss and latency with cloud-based apps.

The success they’ve had with doing that is what we’ll talk about with our blog entry here this week. Cloud-based technology has been integral to how many of the newest apps have the impressive functionality they do, and even if you’re not the savviest person about it you are probably benefiting from it in ways you’re not even aware of, courtesy of that mini PC or Mac masquerading as a ‘phone’ in your pocket.

Having so many developers catering to public cloud IaaS platforms like AWS and Azure, and PaaS and SaaS solutions too, is made possible by the simplicity of consuming the services – at least to some extent, given you can connect securely over the public internet and start spinning up resources.

This is something that shows up on the horizons for good Canadian web hosting providers like us here at 4GoodHosting, as it’s definitely within our sphere.

So let’s have a look at what’s known with the best ways to minimize loss and latency with cloud-based apps.

VPN And Go

The default starting point for any challenge that needs to be addressed or choice that needs to be made is to use the internet to connect to the enterprise’s virtual private clouds (VPCs) or their equivalent from company data centers, branches, or other clouds – preferably with a VPN. But doing so doesn’t guarantee an absence of problems for modern applications that depend on lots of network communications among different services and microservices.

Quite often the people using those applications run into problems with performance, and more often than not it’s related to latency and packet loss – specifically their magnitude and variability. Loss and latency problems will be a bigger deal for internet links than across internal networks. Loss results in more retransmits for TCP applications or artifacts due to missing packets for UDP applications, and too much latency will mean slower response to requests.

If that’s the scenario and there are service or microservice calls across the network, then this is where loss and latency are going to hamper performance most and take away from user satisfaction in a big way. Values that might be tolerable when there’s only a handful of back-and-forths can become wholly intolerable when there are exponentially more of them, given how many such calls modern application architectures put in place.
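
A rough model shows how quickly this compounds. The latency, loss, and timeout figures below are illustrative only, and the retransmit cost is a simplified stand-in for real TCP behaviour:

```python
# A sketch of how per-call latency and loss compound across chained
# (sequential) service calls.

def expected_call_time(latency_ms, loss_rate, timeout_ms):
    # a lost packet costs roughly one retransmit timeout on top of the
    # normal round trip (a simplified model of TCP retransmission)
    return latency_ms + loss_rate * timeout_ms

def request_time(n_calls, latency_ms, loss_rate, timeout_ms=200):
    # sequential microservice calls add up linearly
    return n_calls * expected_call_time(latency_ms, loss_rate, timeout_ms)

# the same 40-call request chain, inside a data center vs. over the internet
lan = request_time(n_calls=40, latency_ms=0.5, loss_rate=0.0001)
internet = request_time(n_calls=40, latency_ms=40, loss_rate=0.01)
```

Per-call figures that look harmless on their own multiply into a response time difference of well over an order of magnitude once dozens of calls are chained, which is exactly the effect described above.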

Varying Experiences

More variability in latency (jitter) and packet loss on internet connections increases the chance that any given user gets a widely varying application experience – one that may be great, absolutely terrible, or anywhere in between. That unpredictability is as big an issue as the slow responses or glitchy video or audio some users see some of the time.

Three specific cloud-based resources come to the forefront as solutions to these problems: direct connection, exchanges, and cloud networking.

A dedicated connection to the cloud is the first one we’ll look at. This is where the customer’s private network is directly connected to the cloud provider’s network. It will usually involve placing a customer switch or router in a meet-me facility. The cloud service provider’s network-edge infrastructure then connects to it with a cable so packets can travel directly from the client network to the cloud network, with no need to traverse the internet.

The only potential hangup is with WAN latency, but as long as the latency to the meet-me facility is acceptable, performance should be comparable to an inside-to-inside connection. If there’s a potential downside it’s probably with how direct connects are expensive compared to simple internet connectivity. They also tend to come in large-denomination bandwidths only – finding something smaller than 1Gbps is unlikely.

Multiple CSPs with Exchanges

Big pipes are always an advantage, and that’s true in any context you can use the term. Cloud service providers (CSPs) with big pipes are able to take large physical connections and separate them into smaller virtual connections at a broad range of bandwidths under 100Mbps. Making a single direct physical connection to the exchange is beneficial for the enterprise user, who can then make virtual direct connections over it to reach multiple CSPs through the exchange.

The next consideration here is for internet-based exchanges that maintain direct connects to CSPs but still leave customers free to connect to the exchange over the internet. The provider typically offers more in the way of on-ramp locations plus a wide network of points of presence at its edge. This makes it so that customer traffic doesn’t need to travel far across the internet before making the important exit into the private network, reducing exposure to latency and loss.

Artificial Intelligence Now Able to Crack Most Passwords in Under 60 Seconds

There are some people who have more in the way of long-term memory ability than short-term memory, and while that may sound good it comes with its own set of problems. Ideally you have a balance of short- and long-term memory, and that will be more beneficial if you’re the type who has a devil of a time remembering their passwords. But it’s never been a good idea to create simple passwords, and it’s even less of a good idea nowadays with the news that rapid advances in AI mean artificial intelligence is almost certainly going to be able to figure out those passwords.

The fact that most of us use password apps on our phones attests to two things. First, how many passwords we need to have given the ever more digital nature of our world. And second, just how many of us don’t have the memory to be able to remember them organically. So if you’re not good with memory but you’ve resisted putting one of these apps on your phone then you may want to now. This is a topic that will be of interest for us here at 4GoodHosting, as like any good Canadian web hosting provider we can relate to the proliferation of passwords we all have these days.

Some of you may be familiar with RockYou, and if you are you'll know that it was a super popular widget found on MySpace and Facebook in the early years of social media. There's a direct connection between the widget and where we're going with AI being able to crack passwords in less than a minute, so let's start there with our web hosting topic blog entry this week.

Password Mimicker

Part of the reason that now is a good time to update your password is this: experts have found AI systems are able to crack almost all passwords easily, just one more example of how the capabilities of artificial intelligence are expanding in leaps and bounds these days. In 2009 RockYou was the victim of a big-time cyber attack, and 32 million passwords that were stored in plaintext were leaked to the dark web.

From that dataset, the researchers took 15.6 million of the passwords and fed them into PassGAN, with those leaked passwords now often used to train AI tools. The significance is that PassGAN is a password generator based on a Generative Adversarial Network (GAN), and it creates fake passwords that mimic real ones genuinely found on the web.

It has two neural networks. The first one is a generator, and the second one is a discriminator. The generator builds candidate passwords, which the discriminator evaluates before they are sent back to the generator. Both networks improve their results based on this constant back-and-forth interaction.
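That generator / discriminator feedback loop can be sketched in miniature. To be clear, this is not PassGAN itself and uses no neural networks — it's just an illustrative toy in Python where the "generator" is a weighted character sampler, the "discriminator" is a simple similarity score, and the tiny training set is a made-up stand-in for leaked data:

```python
import random
import string

CHARSET = string.ascii_lowercase + string.digits

# Toy "leaked" training set standing in for RockYou-style data
REAL_SAMPLES = ["password1", "qwerty12", "dragon99"]

def make_generator(charset):
    # The generator starts from a uniform character distribution
    weights = {c: 1.0 for c in charset}
    def generate(length=8):
        chars = list(weights)
        w = [weights[c] for c in chars]
        return "".join(random.choices(chars, weights=w, k=length))
    return generate, weights

def discriminate(candidate, real_samples):
    # Score how "real" a candidate looks: here, simply the fraction of
    # its characters that also appear in the real password set
    real_chars = set("".join(real_samples))
    return sum(c in real_chars for c in candidate) / len(candidate)

generate, weights = make_generator(CHARSET)
for _ in range(500):
    fake = generate()
    score = discriminate(fake, REAL_SAMPLES)
    # Feedback step: reward characters from candidates the discriminator
    # rated as realistic, so future samples mimic the real set more closely
    for c in fake:
        weights[c] += score

print(generate())  # a candidate password biased toward the training data
```

The real system works the same way in spirit, except both sides are neural networks learning from each other rather than a frequency table and a fixed scoring rule.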

More than Half

Passwords shorter than 4 characters are not common and neither are ones longer than 18, so those were the minimum and maximum for the research; anything outside that range was excluded from consideration. The findings were that 51% of passwords that could be considered 'common' could be cracked in less than a minute by the AI, and 65% of them were cracked in less than an hour.

More than 80% of them had been deciphered by AI within a month, with only the strongest holding out longer than that. The average for passwords with 7 characters was to have them AI-broken within six minutes, and even less if the password had any combination of 1-2-3 or 3-2-1 in it. At those lengths, no other combination of numbers, upper- or lower-case characters, or symbols made much difference in the relative strength of the password when squared up against AI.
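Some back-of-envelope math shows why length dominates here. Below is a sketch in Python; the guesses-per-second rate is an assumed figure for illustration, not something taken from the study:

```python
def seconds_to_exhaust(charset_size: int, length: int, guesses_per_second: float) -> float:
    # Worst-case time to try every possible password of this length
    return charset_size ** length / guesses_per_second

RATE = 1e10  # assumed 10 billion guesses/second for a GPU-backed cracker

# 7 lowercase-only characters: the whole space falls in under a second
print(seconds_to_exhaust(26, 7, RATE))

# 15 characters drawn from ~94 printable symbols: effectively uncrackable
years = seconds_to_exhaust(94, 15, RATE) / (3600 * 24 * 365)
print(f"{years:.2e} years")
```

Every extra character multiplies the search space by the charset size, which is why the jump from 7 to 15 characters moves the worst case from seconds to geological timescales.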

Go 15 or Longer from Now On

The consensus now is that to have AI-proof passwords you should be creating ones with at least 15 characters, with lower- and upper-case letters, numbers, and symbols all being mandatory. Going with one that is as unique as possible and updating / changing it regularly is recommended too, particularly considering that – like everything – AI is going to get better at this too.
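As a quick illustration, that recommendation can be turned into a minimal Python check. The function name and the exact rules below are our own sketch of the advice above, not any official standard:

```python
import string

def meets_guidelines(password: str) -> bool:
    # Sketch of the recommendation: 15+ characters with lower- and
    # upper-case letters, digits, and symbols all present
    return (
        len(password) >= 15
        and any(c.islower() for c in password)
        and any(c.isupper() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(meets_guidelines("sunshine123"))             # False: too short, missing classes
print(meets_guidelines("c0rrect-Horse-Battery!"))  # True: long, all four classes
```

A check like this is a floor, not a ceiling — a 15-character password built from dictionary words in a predictable pattern is still weaker than a random one of the same length.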

Minimizing Loss and Latency for Cloud-Based Apps 

Network traffic is a lot like the motor vehicle traffic we most immediately connect the term with. Build it and they will come, and the concept of induced demand really does work in exactly the same way: if space is created, demand will be created to fill it. That's not so good when it comes to trying to build enough infrastructure to accommodate traffic, and servers are struggling in the same smaller-scale way to accommodate growing data traffic demands.

The advantages of cloud computing have compounded the problem, with so many more users demanding cloud storage space, and increasingly there are apps that are cloud-based and require bandwidth to the point that without it they won't function properly. That's bad news for app developers who don't want people using their app to be impeded in any way. Performance of cloud-based apps that create lots of network traffic can be hurt by network loss and latency, and the best ways of dealing with that are what we'll look at this week here.

It’s a topic that will be of interest to any good Canadian web hosting provider, and that certainly applies to us here at 4GoodHosting. We have large data centers of our own too, but wouldn’t be able to accommodate even 1/1000th of the demand created by cloud storage at all times. There are ways to minimize loss and latency for cloud-based apps and SaaS resources and so let’s get onto that.

Mass Adoption

A big part of the popularity of adopting public cloud IaaS platforms, plus PaaS and SaaS, has come from the simplicity of consuming the services. Connecting securely over the public internet and then accessing and utilizing resources creates strong demands on infrastructure, and there are big challenges associated with private communication between users and those resources.

Using an Internet VPN is always going to be the simplest solution if your aim is to connect to the enterprise's virtual private clouds (VPC) or their equivalent from company data centers, branches, or other clouds. But there are problems that can come with relying on the internet when modern applications depend heavily on extensive network communications. It is also very common for people using those applications to run into performance problems because of latency and packet loss.

It is the magnitude and variability of this latency and packet loss that are the primary notable aspects here, and the issue is more acute when traffic traverses internet links rather than internal networks. Loss results in more retransmits for TCP applications, or artifacts due to missing packets for UDP applications, while latency means slower responses to requests.

Every service or microservice call across the network is an opportunity for loss and latency to hurt performance. Modern application architectures can make these calls explode in number, turning a single operation into hundreds of additional back-and-forth requests, and the accumulated delays can quickly become unbearable.
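A simple model makes the compounding effect concrete. All the numbers here (RTT, loss rate, retransmission timeout, call counts) are illustrative assumptions rather than measurements, and real TCP behavior is more complicated than one timeout per lost packet:

```python
def expected_call_time_ms(rtt_ms: float, loss_rate: float, rto_ms: float) -> float:
    # Rough model: each lost packet costs about one retransmission timeout,
    # so the expected number of retransmits per call is p / (1 - p)
    expected_retransmits = loss_rate / (1 - loss_rate)
    return rtt_ms + expected_retransmits * rto_ms

def chain_latency_ms(num_calls: int, rtt_ms: float, loss_rate: float, rto_ms: float) -> float:
    # Sequential service calls simply stack their expected times
    return num_calls * expected_call_time_ms(rtt_ms, loss_rate, rto_ms)

# One call over an internet path (30 ms RTT, 1% loss, 200 ms RTO): barely noticeable
print(chain_latency_ms(1, 30, 0.01, 200))    # ~32 ms

# The same path under an architecture making 200 chained calls: several seconds
print(chain_latency_ms(200, 30, 0.01, 200))
```

The point of the sketch is the multiplication: per-call penalties that are invisible in isolation become the dominant cost once an architecture chains hundreds of them.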

Need to Reduce Jitter

Variability in latency is referred to as jitter, and the greater jitter that comes with cloud apps is related to packet loss on internet connections. What this does is increase the chance that any given user gets a widely varying application experience that may be great one moment and awful the next. That unpredictability is sometimes as big an issue for users as the slow responses or glitchy video or audio.

Dedicated connections to the cloud are what needs to happen, and the advantages of connecting a customer's private network to the cloud provider's network are considerable. This usually involves customer switching or routing in a meet-me facility where the cloud service provider also has network-edge infrastructure at its disposal. The cabled connection means packets are able to travel directly from the client network to the cloud network with no need to traverse the internet.

Direct connects will darn near guarantee that loss and jitter don't occur. As long as WAN latency is favorable, performance gets as close as possible to an inside-to-inside connection. The only downside is that direct connects can be pricey compared to simple internet connectivity, and may only be available in large-denomination bandwidths of 1Gbps or higher.

Exchanges for Multiple CSPs

Separating big physical connections into smaller virtual connections at a broad range of bandwidths all under 100Mbps is possible now, and extremely effective as a wide-reaching means of cutting back on cloud connectivity costs. It becomes possible for a single enterprise client to make one direct physical connection to the exchange and provision virtual direct connections over it to reach multiple CSPs through the exchange. A single physical connection for multiple cloud destinations is now all that's needed.

Most enterprises use multiple cloud providers, not just one, and most add more all the time; many will never be 100% migrated to cloud even if a good portion of their workloads already are. This makes closing the gap between on-premises resources and cloud resources part of the ongoing challenge as well, but fortunately the different sets of options for addressing these challenges have evolved and improved quite notably in recent years.

Stalkerware an Ever Increasing Threat for Mobile Devices

It has been quite a while since we made any type of cybersecurity threat the focus for our blog entry, but one that is really prominent, and increasingly so these days, is mobile Stalkerware. Stalkers are different from spies, and Stalkerware is different from spyware. As the name suggests, what it involves is uninvited following and tracking of your activities and whereabouts. It can be just as disconcerting digitally as it is in real life, and more specifically what these apps do is record conversations, location, and pretty much everything you type.

So yes, they pretty much nix any aspect of privacy you might have – and expect to have – with the operation of your mobile device. And the problem is that all of this occurs while you have no idea you're being 'stalked'. These types of malware often come disguised as calendar or calculator apps, and 'Flash Keylogger' is the most infamous one that was busted for it and is fittingly nowhere to be found these days.

Passing on information that their clients can use to keep themselves cyber-safe is going to be agreeable for any reputable Canadian web hosting provider, and that certainly applies for us here at 4GoodHosting too. So we’re taking this entry to talk about why Stalkerware is even more of a problem now than before, and then we’ll conclude by talking a little bit about what you can do to get rid of it.

3x The Risk Now

Apparently there is 3x the risk of being a victim of Stalkerware compared to three years ago, with the possibility of encountering this form of mobile malware up 329% since 2020. These attacks involve the attacker stealing the physical and online freedom of the targeted person by tracking their location and monitoring their smartphone activity, all without consent or the victim being aware it's going on.

One of the biggest risks can be with valuable information exchanged in a text message, for example. Stalkerware may also be installed secretly on mobile phones by people who have grudges or ill will towards a person, and there have even been instances where it is concerned parents who are behind a Stalkerware infection on a device. This is not only about stealing personal data; there are also tangible implications concerning the safety of the individual targeted.

As mentioned above, Stalkerware commonly imitates benign apps such as notes, calculators, or ones similar to these. This allows them to stay hidden in plain sight, with the victims seeing the apps every day on their phone and not thinking much of them. Sometimes they are advertised as apps used to keep a close eye on children and other people that are unable to take care of themselves.

Detection & Removal

The most reliable way to make sure your devices aren’t carrying Stalkerware is to go through all of the apps installed on the device and make sure they all work as intended. A phone that suddenly drops in performance, or starts crashing and freezing for no apparent reason may have a Stalkerware app installed on it. Another indicator can be if suddenly you have a new browser homepage, new icons on your desktop, or a different default search engine.

There are 3 best ways to get rid of Stalkerware on your phone. First is to conduct a factory reset of the device. That's not something you should do unless you're aware of what you stand to lose, but if you do decide to, it's important to first back up all important data on your phone: your videos, contacts, photos, etc. You can do this using your phone's default cloud service, or use something like Google Drive to back up your data.

Your next choice would be to update your operating system. Some Stalkerware is only designed for older operating system versions and so an OS update might disable the stalkerware installed on your device. The Stalkerware may still continue to operate even after an OS update though. If your device has an OS update available, you’ve really got nothing to lose by trying this way.

Last up is using a malware removal app specifically designed for stalkerware. There are lots of good ones, including Norton, McAfee, Bitdefender, and Avira. However, be aware that you’ll need to pay for them.

Use of Sovereign Clouds for Country-Specific Data

Try to travel outside your country of citizenship without a passport and see how far you get – likely no further than the airport or terminal. And that is the way it should be, considering there are some people – and some things – that aren't allowed to leave the country. Either it's best that they stay in, or it's best that they aren't elsewhere in the world. The last part of that would apply to anyone who's a risk to others, but the first part of it can apply to big data too.

There are companies that need to give federal governments the assurance that some of the data they collect from customers or investors doesn't go beyond their borders. A good example would be manufacturers who work with the Department of Defence, or others who hold patents on manufactured goods where the government has a vested stake in keeping that technology under wraps and away from the eyes of countries where patent laws aren't adhered to the way they are here.

All of this comes at a time when the ongoing shift to cloud computing is as strong as ever, and when shedding the bulk and expense of physical data storage is a huge plus for companies of all sorts. This is the aspect we can relate to here at 4GoodHosting, being a good Canadian web hosting provider that can see the bigger picture with anything digital, despite being a mere bit player ensuring that a company's website is always there and open for business – or at the very least connection – on the World Wide Web.

The need for cloud storage while staying inside laws around domestic data control has led to what are called ‘Sovereign’ Clouds, and that’s what we are going to look at with this week’s blog entry.

Protecting Interests

Banking is another prominent example of an industry that has been very eager to adopt cloud computing but where certain data segments need to be kept within the country. Insurance, healthcare, and the public sector are others where complying with laws and requirements within specific regions is important. Sovereign clouds are the type of cloud architecture that has been developed to meet this need.

They are semi-public cloud services that are owned, controlled, and operated by a particular country or region. In some instances that controller may be a cloud provider serving a smaller nation or region. They may be owned by the local government outright or by a consortium of private and public organizations, or owned by private companies that work closely with the government.

The objective of sovereign clouds is to provide computing infrastructure that can support specific government services, most notably protecting sensitive data and complying with laws and regulations specific to a country or region. Until not long ago, mega cloud providers that served all countries were quite standard, but even with the introduction of sovereign clouds and the shift to them we will likely continue to need the hyperscalers for some systems that are less cost-effective to run on sovereign clouds.

Sovereign clouds are always going to be part of multi-cloud deployments, and what we're seeing today is the acceptance of multi-cloud and its flexibility driving new interest in sovereign clouds.

Sovereign Cloud Benefits

Increased control and ownership of data is far and away the most prominent advantage to having big data in a sovereign cloud. It ensures that data is stored and managed in compliance with local regulations and laws, including keeping data in specific countries or regions. Use of public clouds might put you at risk of having your data made available outside of the country, and it might not even take anyone or any group doing something 'wrong' for that to happen.

Sovereign clouds take that possible risk out of the equation since they physically exist in the country they support. Enhanced security measures are another big part of the appeal for sovereign clouds. They offer encryption, access controls, and network segmentation that may be tailored to specific countries or regions. Larger public clouds may provide the same or better services, but the way sovereign cloud security systems are purpose-built for a specific country's laws and regulations makes them superior for supporting data security measures in that country.

Other benefits:

  • Higher service availability and reliability levels in comparison to commercial cloud providers
  • Customizability for meeting the specific needs of a country or organization, including compliance requirements, data storage, and processing capabilities
  • Creation of jobs and supporting local economic development

Sovereign Cloud Drawbacks

It is always possible that a sovereign cloud may not be compatible with other cloud infrastructures, creating the chance of interoperability and data exchange challenges. Data privacy concerns are legit too, as there have been plenty of instances across history where governments have taken advantage of having this type of control. Many companies prefer to use global public cloud providers if they believe their local sovereign cloud could be compromised by the government.

Sovereign clouds may not have the same capacity for speedy adoption of new technologies and services as global cloud providers, which might limit their ability to innovate and remain competitive. They likely won't be able to offer the same range of services either, considering they don't have billions to spend on R&D like the larger providers. Flexibility and autonomy may also be lessened, as organizations that rely on a sovereign cloud may end up dependent on the government or consortium operating it.