Improving on Windows 11 Threat Protection

There have always been two tribes when it comes to computing device preferences – you’re either a Mac or a PC. Those who prefer Macs will usually have a long list of reasons why they prefer them, and some will point to their perception of greater solidity when it comes to defending against web-based threats. Those threats are ones you are not going to be able to steer clear of if you’re accessing the web, and that’s why robust virus and threat protection is super important no matter what type of device you’re using.

Whether Macs are more secure than PCs certainly hasn’t been proven definitively, and people who prefer PCs will have their own long list of reasons as to why they prefer them. Neither type is completely impervious to threats of these sorts, but recently a lot has been made of the shortcomings of Windows 11 when it comes to device security. Here at 4GoodHosting we are definitely attuned to how this is a top priority for a lot of people, and like any other Canadian web hosting provider we can relate to how it’s not something you brush aside if operating your business means collecting and retaining sensitive data.

Which leads us to the good news we’re choosing to dedicate this week’s entry to – there are ways that users can improve threat protection for Windows 11 devices and they are not overly challenging, even for people who aren’t the most tech savvy.

Minimal Protection Built In

Windows 11 is an upgrade on its predecessor when it comes to device security, particularly with TPM and Secure Boot plus the guarantee of future security updates that come with them. The problem is that TPM and Secure Boot only protect against two types of threats, and their effectiveness is entirely tied to hardware configuration. If detection can’t be done based on the signature of the BIOS drivers and their relation to the OS, then you’re out of luck when it comes to threat detection.

So here are the threats, and what you can do to improve security on a Windows 11 device to defend against each of them more effectively:

  1. Social Engineering

Actions taken on your PC determine your level of risk. Clicking on links, downloading files, installing programs or plugging in external USB drives without using caution and judgment isn’t wise. Doing so can create the problems that security hardware and software try to shield you from. And just because you received it from a trusted source doesn’t mean the link, program, or drive itself is to be trusted.

The same can be said for making personal information available, like your birth date, location, phone number, social security number, and so on. This is because it can be used to gain unauthorized access, and many times when something does occur the biggest part of the headache is that the access extends to your linked Microsoft account and other services. You should also be sure not to store certain kinds of sensitive information in a non-encrypted file (e.g., a Word doc) or share it via non-encrypted forms of communication like email or text message.
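
If you’re even slightly comfortable with code, below is a minimal sketch of what ‘don’t leave it in a plain Word doc’ can look like in practice. It assumes the third-party Python ‘cryptography’ package, and the file names are made up for illustration – a password manager or encrypted disk achieves the same thing with no code at all.

```python
# A minimal sketch of encrypting a sensitive note before storing or sharing it.
# Requires the third-party package:  pip install cryptography
# The file names here are hypothetical examples.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this key somewhere safe, e.g. a password manager
cipher = Fernet(key)

with open("tax_details.txt", "rb") as f:        # hypothetical plain-text file
    encrypted = cipher.encrypt(f.read())

with open("tax_details.txt.enc", "wb") as f:    # store/share only the encrypted copy
    f.write(encrypted)

# Later, anyone holding the same key can recover the original:
original = cipher.decrypt(encrypted)
```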

  2. Viruses and Malware

Malware can be a major source of problems for devices running Windows 11, and the truth here is that the best defense against those threats is to be careful with your daily routine. But you will still need quality antivirus software for Windows 11, and Windows Security is functional enough as Microsoft’s packaged solution that comes with the operating system. For basic internet security it’s fine, but for anyone whose usage needs or inclinations have them more exposed to threats it is just not sufficient.

Choosing to install 3rd-party software is an option, but you may not need to assume that expense. Some people choose to augment Windows Security with a more malware-specific program that provides a little more protection. Don’t go overboard with layering them though, as they can end up conflicting with each other and being less effective as a result.

  3. Open Incoming Ports

With Windows 11 the user will need to keep access to incoming ports blocked in order to prevent being exploited through them. Going with no firewall on your PC is the same as leaving a house with all of its doors not only unlocked, but actually wide open. When incoming ports are left completely exposed, anyone on the internet can attempt to exploit services on your computer available through those ports. When that happens successfully, you’re going to have problems.

The firewall will close them up and many home routers have a built-in hardware firewall. However, you can’t fully rely on them and individual device protection is needed to go along with network-oriented protection. Windows 11 provides sufficient built-in firewall protection, but you need to make sure it is turned on in the Windows Security app.
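
For the curious, here is a rough self-check you can run to see which well-known Windows services are accepting connections on your own machine. It is only a sketch – it shows what is listening locally rather than what your firewall exposes to the outside world – and the port list is just an example.

```python
# Try connecting to a few well-known ports on this machine to see which ones
# are accepting connections. The selection of ports is illustrative only.
import socket

COMMON_PORTS = {135: "RPC", 139: "NetBIOS", 445: "SMB", 3389: "Remote Desktop"}

for port, name in COMMON_PORTS.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        result = s.connect_ex(("127.0.0.1", port))   # 0 means something is listening
        print(f"{name} (port {port}): {'open' if result == 0 else 'closed'}")
```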

  4. Data Leaks

It is actually impossible to entirely stop data from being leaked onto the web, and the reality is breaches and leaks are an unavoidable part of life. Windows 11 may have an acceptable level of security, but if the password you have for your linked Microsoft account is the same one used for other services then the basic protections that come with the OS aren’t going to save you from unauthorized account access.

Piece of advice #1 here is not to reuse passwords. When creating them you should come up with a strong, random, and unique password for every service and website used, and immediately change your password for any place where there’s been a breach or leak. Password managers are a good choice – they keep track of all of those random character strings in a way that’s safe, and you don’t need to remember them individually.
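
For anyone curious how little it takes to produce that kind of strong, random password, here is a minimal sketch using nothing but Python’s standard library. The length and character set are arbitrary choices.

```python
# Generate a strong, random, unique password using the standard library.
import secrets
import string

def make_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(make_password())   # different on every run; store it in a password manager
```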

Two-factor authentication is also good for beefing up defenses against data leaks. It may well be that second step in the login process that ends up thwarting attempts to access your account. The most secure method is a hardware dongle, but most of you will find that a mobile app generating a code provides an ideal balance between security and convenience.
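
If you have ever wondered what those rotating 6-digit codes actually are, here is a bare-bones sketch of the standard TOTP calculation (RFC 6238) that authenticator apps perform. The shared secret below is a made-up example, and real apps store the secret far more carefully than a script does.

```python
# Derive a 6-digit time-based one-time password (TOTP, RFC 6238) from a shared secret.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step                  # changes every 30 seconds
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # example secret only – never reuse a published secret
```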

  5. Spying on your Internet Traffic

Every network will have the data being requested and sent to individual devices on display if an individual knows where to look (packet sniffing). The more open a network is, the easier this is to do. Public Wi-Fi networks are the worst for this risk, particularly when data is not encrypted. In that scenario the exact information you’re transmitting may be visible too, and that can be a big problem obviously.

If data being transmitted is sensitive, then a VPN is the best choice. It will create a secure tunnel that your traffic is funneled through. Use a VPN on your devices when on public Wi-Fi networks and you’ll be MUCH better protected.
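
And if you want to reassure yourself that a given connection really is encrypted in transit, checking the TLS handshake details is straightforward. Here is a minimal sketch using Python’s standard library; the hostname is just an example.

```python
# Confirm that a connection to a site is wrapped in TLS and the certificate verifies.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()   # verifies the certificate chain by default

with socket.create_connection((hostname, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Protocol:", tls.version())      # e.g. TLSv1.3
        print("Cipher:  ", tls.cipher()[0])    # negotiated cipher suite
```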

Continuing Merits of Tape Storage for Petabyte Data

Obsolescence is real, and it’s an unavoidable reality for nearly all types of technology eventually. Even what is especially practical today will likely one day become useless, and as has often been said, ‘you can’t stop progress.’ When it comes to the digital world and the ever-greater demands we have for data storage, the way the Cloud has started physical storage down the road to obsolescence is definitely a good thing, especially considering that physical data storage comes with a whole whack of costs that go profoundly beyond what it costs to lease the space.

The migration from tape storage to cloud has been underway for the better part of two decades now, and here at 4GoodHosting we are like any good Canadian web hosting provider in that we know all about the pros and cons of different data storage means, given the nature of what we do for our customers and the fact that we have two major data centers of our own, in Vancouver and Toronto. Cloud storage is the way of the future, and all things considered it is 100% the better choice for data storage.

The merits of tape storage for certain types of data continue to exist, however, and in particular it has a lot going for it when it comes to storing petabyte-scale data. If you don’t know what that is, we can explain. You almost certainly know what a gigabyte is, and how there are 1,024 of them in a terabyte. Well, a petabyte is 1,024 terabytes. So needless to say we’re talking about a very large amount of data, but what is it that makes tape storage preferable in some instances with this data? Is it just the sheer size of it that is the primary factor?
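
For anyone who likes to see the arithmetic spelled out, here is a quick sketch of those conversions using binary units.

```python
# Sanity-checking the sizes mentioned above (binary units: 1,024 GB per TB, 1,024 TB per PB).
GB_PER_TB = 1024
TB_PER_PB = 1024

one_pb_in_gb = GB_PER_TB * TB_PER_PB
print(f"1 PB  = {one_pb_in_gb:,} GB")        # 1,048,576 GB
print(f"10 PB = {one_pb_in_gb * 10:,} GB")   # the scale a large tape archive can reach
```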

This is what we’ll look at with this week’s entry, and why the use of tape storage resists going entirely extinct.

Slow to Dwindle

Here in late 2021 only 4% of organizations still use tape as their only backup method, while the use of cloud and online backups has gone up to 51%. It is estimated that 15% use a combination of disk and tape. It’s easy to list what is inferior about tape storage, but it is difficult and slow to eliminate completely because of the years of historic backups that need to be kept. Smaller businesses are the ones that can often get away from it freely and switch to a new method without much hassle.

For larger firms, however, and those with compliance requirements, it is still quite common to need to retain tape storage. Many times this can be because of regulations pertaining to the operation of the business. Some companies don’t like what the transfer costs and the manpower required to manage two backup methods while older retentions expire would entail, and this has them sticking with tape storage too.

Cost considerations are definitely a big drawback to making a wholesale switch, and that’s because migrating years of tape archives and running two systems in parallel can be very expensive. Consider how, when cloud backup services were introduced, the high cost of disk storage and bandwidth made the service prohibitively expensive. With greater adoption has come lower costs, and that in turn has made cloud storage even more appealing.

Demand = Supply = Lower Costs


Another reason is that tapes are incredibly inexpensive. When cloud backup services were introduced, the high cost of disk storage and bandwidth made the service too expensive for most. As storage and bandwidth costs have plummeted, online or cloud backup has become increasingly accessible. Tape storage becoming more and more archaic and less and less in use means the cost of it hasn’t gone down at all.

Even if tape is still less expensive (and it is), the benefits of automation, control, and reliability make cloud backups less pricey in the long run, along with offering obvious peace of mind in knowing data isn’t stored in a physical data center that has risk factors the cloud doesn’t. Smaller organizations that still have extensive storage needs for multiple petabytes of data will find that the cost difference between tape and the Cloud becomes quite significant.
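
To give a feel for how that difference can play out, here is a purely illustrative back-of-envelope comparison. Every figure in it – archive size, tape media price, and cloud archive-tier price – is an assumption for the sake of the example rather than a real quote, and it ignores drives, handling, and retrieval fees.

```python
# Purely illustrative cost comparison for a multi-petabyte archive (assumed prices).
ARCHIVE_TB = 5 * 1024                  # a 5 PB archive expressed in terabytes

tape_media_per_tb = 5.0                # assumed one-time media cost, USD per TB
cloud_archive_per_tb_month = 1.0       # assumed cold-storage tier, USD per TB per month
years = 5

tape_total = ARCHIVE_TB * tape_media_per_tb
cloud_total = ARCHIVE_TB * cloud_archive_per_tb_month * 12 * years

print(f"Tape media outlay:          ${tape_total:,.0f}")
print(f"Cloud archive over {years} years: ${cloud_total:,.0f}")
```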

Physical Air Gap

Another plus for tape backups is that they offer the benefit of being physically separate and offline from the systems being protected. In many ways this is kind of like reverting to an older, offline technology to thwart anyone with malicious plans who isn’t familiar with that technology. There are methods to logically ‘air-gap’ and separate cloud backups from your production environment, but they don’t have that reassuring nature that some people like when they’re able to have a tangible version of something.

All in all, though, the idea of relying on a degradable magnetic storage medium isn’t wise for most people, and the primary reason they will want to upgrade to a more modern solution is for automation and reliability. Keep in mind as well that tape backups are a very manual process. They need to be loaded, collected, and transported to an off-site storage location.

Slow but Sure Shift

The industry consensus is that tapes will not stop being used any time soon. Tape storage is expected to continue to be the lowest-cost storage option for the foreseeable future, and it is true that tape sales to hyperscale data libraries continue at the same numbers as have been seen over the last decade and beyond.

With more data moving to the cloud all the time, cloud providers are going to need to offer even more competitive low-cost storage. The lowest cost archive tiers of storage offered by all the major cloud providers use some amount of tape storage, even if you’d guess they don’t. For data storage in the petabytes, there’s still a lot to be said for it.

Heads Up: New Android ‘Ultima SMS’ Subscription Scam

151 isn’t the biggest number in the world, but it’s not the smallest either. 10.5 million? That’s a big number indeed, and where we are going with this is that there is an ongoing fraud campaign making its way around the web right now called Ultima SMS that people should be made aware of, given the sheer scale of it and just how many people could be affected. 151 is the number of Android apps that this campaign has been identified with (so far), and that 10.5 million figure is the number of times those specific apps have been downloaded.

Most malware is much more deliberately malicious, but that’s not to take away from the seriousness of Ultima SMS and why people should be made aware of it (and why we’re choosing to make it our subject this week). Here at 4GoodHosting we’re like any good Canadian web hosting provider in that we know that people don’t like surprises when they’re the type that end up costing them more money. That’s what makes the Ultima SMS subscription scam so noteworthy – it upgrades users to premium subscription memberships without them being aware of it.

Now the question obviously becomes what their gain would be in doing this. They get a cut of the monies gained from these involuntary subscription charges. So let’s look at this very newsworthy scam, as security concerns related to apps downloaded 10+ million times definitely make it newsworthy.

Gone – Just Not Quickly Enough

The good news here is that Google wasted no time in removing the apps, but those multi-million downloads have worked out to millions of dollars in fraudulent subscription charges already. The way they drew unsuspecting users to the bait was with discount apps, games, custom keyboards, QR code scanners, video and photo editors, spam call blockers, camera filters, and more.

Once one of the affected apps was launched for the first time using mobile data, it would check the device’s location and IMEI and adjust its language to match that country. The app would then prompt the user to enter their mobile phone number and email address, supposedly to gain access to the program’s features.

Then once the phone number is obtained along with the required permissions, the app proceeds to subscribe the victim to a $40 per month SMS service. And, as mentioned, the scammers get a cut as an affiliate partner. It’s also recently been determined that the app authors have put into place a system that hits the victim with the maximum charge amount based on their location.

The sheer volume of app submissions is what’s making this work, as apparently many of the apps and their ‘offerings’ aren’t particularly good in the first place. The aim is to have a constant inflow of unsuspecting victims and to preserve a presence on the Play Store despite the constant reporting and take-down actions.

Some Spots Worse

Not surprisingly, it’s not a scenario where the entire world is being affected by this equally. The countries that are currently most affected by the Ultima SMS scam are:

  • Egypt
  • Saudi Arabia
  • Pakistan
  • UAE

So while we can safely assume there’s a whole lot of unwanted premium subscriptions going on in the Middle East and moving into South Asia, it’s also estimated that nearly 200,000 devices are affected in North America.

Uninstalling the app will prevent new subscriptions from being made. However, it will not stop the existing subscription from being charged again. This is where the hang-up is: you need to contact your carrier and ask for a cancellation of all SMS subscriptions.

Best Avoidance Practices

Falling victim to this kind of stuff can happen to anyone, and if it does you’ll do best to smarten up about avoiding online pitfalls like this one. Here is what industry experts say are best practices for doing that:

  • Stay vigilant – be wary of apps advertised in short and catchy videos
  • Disable premium SMS options with your carrier – by doing this you’ll be well defended against anything similar, and this is a really smart move in general if your children handle your device from time to time
  • Check reviews – written reviews may reveal the true purpose of an app
  • Hold off on entering your phone number – if you don’t trust an app you should choose to not share personal details with it
  • Go over fine print – it is helpful to know that legitimate apps almost always have a Terms of Service and a Privacy Policy, as well as a statement about how user submitted information will be used
  • Use official app stores only – as mentioned, the offending apps are no longer on the Google Play Store, but you can be sure they’re still able to be found elsewhere

Cloud Technology for Enterprise Security

Cyber security is certainly turning out to be one of the buzzwords of the 21st century, or at least so far and depending on whether you’re in certain circles. It is also certainly something that anyone running a business would never even have heard of or had to consider in the days before businesses started to set up shop along the Information Superhighway. But nowadays nearly all of them who have taken their business online will be all too familiar with terms like malware, ransomware, and the like. There are always going to be bad actors, but nowadays they’re a whole lot more inconspicuous in the digital space.

There’s been plenty of documented cases where big businesses have taken big hits because of cyber attacks, but fortunately cloud computing technology has really stepped up to be a valuable and powerful ally in the fight against cyber crime. The reason this is noteworthy for us here at 4GoodHosting is that the nature of what we do lends itself to taking an interest in something that will definitely be a forefront issue for websites hosted and operated because of e-commerce interests. We imagine this would be true for any good Canadian web hosting provider.

So we figured this would be an excellent topic to delve into for this week’s entry: explaining in more detail how having the benefits of cloud computing on the side of enterprise security is such a big plus for anyone who has reason to be concerned about how air-tight their website and business data are, or are not, depending on the circumstance.

New Rules

Enterprise organisations today have no choice but to play by a new set of cyber and physical security rules. There’s no stopping the advances hackers are making, and they will continue to find new and faster ways to get past security protocols. It’s fair to say now that traditional models where cybersecurity and physical security teams operate as separate entities can no longer ensure being defended against the newest and revamped threats. Converged cyber and physical security teams are great, but a key piece of a successful security strategy is having the right technology and tools.

The Cloud is a counter to those expanding risks, and we are beginning to see how platforms and software that run in the cloud are making it easier for businesses to identify breaches and take action much more speedily. There are 5 different ways cloud-based technology is making a real positive difference with cyber and physical security strategies today.

So let’s have a look at those.

  1. Cross-Platform Integrations

The rise of IoT and cloud-based tech has enabled progress toward better communication systems like nothing else that came before it. Integrating security systems and software tools creates a more unified platform that is easier to manage and control. Integrated systems also take the weight off IT teams by bringing multiple systems into a single dashboard. We’re seeing how integrating access control and video surveillance systems allows teams to visually verify events as they are happening, with real-time video paired with all access activity.

Access control, video surveillance, alarm systems, building management, identity management and provisioning, and cybersecurity tools are among the best cloud-to-cloud integrations that promote the streamlining of business operations.
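
To make the integration idea a little more concrete, here is a hypothetical sketch of the kind of glue an integrated platform handles for you: pairing an access event with video footage from the same time window. Both client classes and their methods are invented purely for illustration – real access control and video surveillance products expose their own APIs.

```python
# Hypothetical sketch: when an access event arrives, look up video from the
# same camera and time window so the event can be visually verified.
from datetime import datetime, timedelta

class AccessControlClient:
    def recent_events(self):
        # stand-in for a real access-control API call
        return [{"door": "server-room", "user": "jdoe", "time": datetime.utcnow()}]

class VideoClient:
    def clip_url(self, camera: str, start: datetime, end: datetime) -> str:
        # stand-in for a real video-surveillance API call
        return f"https://video.example.internal/{camera}?start={start.isoformat()}&end={end.isoformat()}"

access, video = AccessControlClient(), VideoClient()
window = timedelta(seconds=30)

for event in access.recent_events():
    url = video.clip_url(event["door"], event["time"] - window, event["time"] + window)
    print(f"{event['user']} at {event['door']}: review footage at {url}")
```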

2. Auto Patching and Software Upgrades

When it comes to mitigating a security breach, every minute is crucial. Outdated technology can mean it takes hours or even days just to determine that a breach has happened, before you’ve even begun assessing the damages and running audits as necessary. Among the cloud’s best benefits in this regard is that firmware and software updates can be completed over-the-air, making it so that systems are always running the latest security features for optimum protection from the newest vulnerabilities.

Automatic updates are also integral to making enterprise security technologies evergreen. Rather than needing to replace hardware every 2 or 3 years, the latest features are available with a few clicks. We see businesses using a cloud-based keyless entry system rolling out features and product updates that can thwart modern threats and safety hazards, and that happens without physical hardware upgrades or anything else of the sort being required.

3. Remote Data and Control Access

The new realities of running a business mean that having staff onsite for every single task just isn’t practical or doable. When operations can be done remotely, a business becomes more agile and adaptable. Being able to decentralise operations across multiple locations is hugely beneficial for cyber and physical security. The Cloud offers exceptional remote access compatibility, and this has inherent benefits in the way it also makes better security protocols less demanding for the people who have to see to them.

4. Leveraging IoT Automations

No one’s going to argue that automations are the future, and manual processes are going to be few and far between before long. Scalable businesses that get a leg up because of that scalability are going to be the ones successful with automations. Cloud-based cyber security systems are generally more dynamic and require less labor to execute automations. For example, a scalable security strategy should include automatic alerts for access events, the ability to automatically disable and deactivate old credentials or accounts, and automatic alerting and routing for emergency procedures.

Intuitive cloud-based software can provide a simple rules engine that makes setting up these automations straightforward, and they can be easily adjusted at any time.
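
As a toy illustration of the kind of rule such an engine might run, here is a sketch that deactivates credentials unused for 90 days. The data, the threshold, and the fixed ‘today’ are all made up so the example is reproducible.

```python
# Toy automation rule: deactivate credentials that have not been used in 90 days.
from datetime import datetime, timedelta

credentials = [
    {"user": "alice", "last_used": datetime(2021, 10, 30), "active": True},
    {"user": "bob",   "last_used": datetime(2021, 5, 2),   "active": True},
]

STALE_AFTER = timedelta(days=90)
now = datetime(2021, 11, 15)   # fixed 'today' so the example always behaves the same

for cred in credentials:
    if cred["active"] and now - cred["last_used"] > STALE_AFTER:
        cred["active"] = False
        print(f"Deactivated stale credential for {cred['user']}")
```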

5. AI-based Monitoring / Detection

Artificial intelligence is expanding in leaps and bounds, and installing AI systems delivers the benefit of having technology that will learn based on your specific business and trends. Then, when it is paired with integrations, AI-powered detection tools can help identify security issues faster and improve response accuracy.

Today’s AI-powered video surveillance can detect nearly anything. Smarter analytics go beyond monitoring, as AI analytics tools work very well for identifying key trends in attendance and space usage. This is then helpful for making key business decisions. When it comes to meeting business goals and ROI, the right security tools are an important factor in reducing costs while creating more efficient and scalable systems along with them.

More Repair Options for PCs on Way for 2023

Neither of the two giants in Apple and Microsoft does much in the way of making their devices easily repairable or upgradeable, and while trying to keep their stuff proprietary as much as possible is understandable, it’s not good how so many PCs and other computing devices are discarded and end up as electronic waste instead of being repaired. The basics of electronic device repair aren’t that difficult to get, and you might be surprised what can be done with know-how, a steady hand, and some soldering skills.

Working on devices that are able to access the web is a huge part of daily life for so many people, and it will be beneficial to try and limit the amount of e-waste we create when getting rid of ones that could still have a longer working life. This is why it’s good news that Microsoft has announced that they are going to make desktop and notebook PC repair much more accessible to people. This will also have huge benefits for providing fully functional computing devices to developing regions of the world where they will assist with education and other interests.

Trying to minimize their environmental footprint is a priority for any quality Canadian web hosting provider in the same way it is for all businesses these days, and at 4GoodHosting we see the value in making people aware of news like this that is in line with environmental interests related to digital devices. E-waste is a problem, and it is going to be very beneficial if people can have their computers and other devices repaired more easily so they don’t have to keep buying new ones and furthering the cycle.

Around a Trend

A large portion of the carbon emissions associated with the devices we own are generated during manufacturing. Replacing products before the true end of their working life causes the emissions, pollution, natural resource use, and land degradation associated with extracting and refining raw materials to go way up, and it means more toxic e-waste polluting the environment in places like Agbogbloshie, Ghana and Guiyu, China.

The White House is already moving towards legislation that will have the US FTC dismantling repair restrictions around phones and electronics, and this is something that has long been needed here in North America and around the world. It’s also about ensuring that lower income families or individuals can have the same degree of web connectivity to go along with the basic rationale of being able to repair something you use as a tool in the same way you do your motor vehicle.

Both take you to destinations, in a sense. The reason you’re soon going to be able to take Microsoft products to 3rd-party repair services OR fix them more easily yourself is because of As You Sow, an activist group that pushes companies to be more aware of the environmental degradation that comes with the excessive e-waste resulting from the shortened lifespans of devices. They were able to make this request as part of a shareholder resolution that they were entitled to present.

Their request is that Microsoft analyze the environmental benefits of making its products easier to repair, and now Microsoft is promising to ‘expand the availability of certain parts and repair documentation beyond Microsoft’s Authorized Service Provider network.’ They are also going to offer new mechanisms to enable and facilitate local repair options for consumers, allowing them to have their Microsoft devices repaired outside what is now a limited network of authorized repair shops.

Right to Repair Movement

Just this summer US President Joe Biden issued an executive order instructing the Federal Trade Commission to craft new rules addressing unfair anticompetitive restrictions on third-party repair. As of right now 27 states are looking at passing right-to-repair bills, and New York has introduced the first-ever broad right-to-repair bill that targets all sorts of consumer products that should be repairable if parts are made more readily available by the manufacturer.

A similar type of request has been made to Apple, and industry experts say it is very likely that all major manufacturers will need to be able to prove they are operating in a more ecologically friendly manner. All sorts of consumer electronics should be made easier to fix yourself, and although that will mean fewer products being produced and sold, it really is high time that something like this happens considering just how problematic planned obsolescence and the like really are.

We are definitely fans of the Right to Repair Movement, and we’re happy to see that there are similar movements here in Canada that are pushing for the same sort of outcomes. If you don’t already have a soldering iron at home, it might be time to get one.

All About Handshake Domain Names

Ever since the web was in its infancy and URLs were just starting to be a thing, internet names that are TLDs (Top Level Domains) have been administered by ICANN, a centralized organization that, in the opinion of many knowledgeable people in the industry, has outlived its usefulness for managing internet names. It’s only very recently that legitimate alternatives to this monopoly of sorts have come into existence, but the one that’s really generating some buzz these days is Handshake.

It is the exact opposite of ICANN, and in particular with the way it is a decentralized naming solution for the Internet that is powered by blockchain technology – another major disruptor in the industry that we’ve also touched on here on a number of different occasions. HNS is the abbreviation for the Handshake naming system, which is a peer-to-peer network and decentralized system using blockchain as a means of offering better control, freedom, and security of the domain and website.

As you’d expect, this sort of development comes up immediately on the radar for those of us here at 4GoodHosting, in the same way it would for any good Canadian web hosting provider that likes to have its thumb on the pulse of web hosting technology and the options that become available to people who need to claim their spot on the web and use it to their personal or business advantage. The appeal of HNS naming is that it is in line with decentralizing the web and allowing for a fairer reorganizing of the Internet.

So how does Handshake domain naming work, and what exactly makes it better for individual users? That’s what we’ll look at this week.

Handshake Domains – How Do They Work?

Let’s start here with a basic refresher on domain names. All websites accessible on the Internet are found on servers identified using Internet Protocol (IP) addresses. Users aren’t expected to know IP addresses, so internet names are mapped to their corresponding servers by means of the domain name system (DNS). DNS itself is distributed, but the ultimate control of names via the DNS system is held by a limited number of interest groups, and they don’t always act equitably.
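
That traditional mapping is something you can see for yourself in a couple of lines of Python, using the standard library and an example hostname.

```python
# The traditional DNS lookup described above: a human-readable name is
# resolved to the IP address of the server behind it.
import socket

print(socket.gethostbyname("example.com"))   # e.g. 93.184.216.34
```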

The Handshake name system is entirely different by design. While it also maps names to IP addresses and can be utilized in essentially the same way as the traditional DNS, names are administered by a blockchain model instead of a single centralized entity. What is key here is how Handshake takes decentralized control of the root zone and can then be used for so much more than just mapping to servers in the internet space.

As a decentralized, permissionless naming protocol where every peer is validating and in charge of managing the root DNS naming zone, Handshake meets a much more agreeable vision of how the control of TLDs is made available in a more fair system and one that doesn’t favor some greatly at the expense of others.

It’s really starting to emerge as an alternative to existing Certificate Authorities and naming systems, and it’s a darn good thing.

Distribution of Handshake Names

There is always a chance of name ‘squatting’, so the Handshake protocol reserves the top 100K domain names according to Alexa.com as well as giving priority on existing TLDs to current owners. As a result, and to use one example, Google – which currently leases google.com from Verisign, the controller of the .com TLD – can instead lay a claim to the ‘Google’ name via the Handshake blockchain.

This can be applicable for less competitive domain names too, with the blockchain facilitating name auctions which can be bid on by anyone who is in possession of Handshake tokens. This would deliver a very different owner, user, and visitor experience right across the board, but what is interesting to note is that with an HNS the internet user would be navigating to a website in an entirely decentralized manner and with nothing in the way of censorship related to a centralized authority.

Entities that are currently in existence and able to take domain names away from owners under the current ICANN style of governance would be rendered powerless by a Handshake domain name system powered by blockchain. If you’d like to learn more about uncensorable domain names you can find quite a bit of information out there.

Accessing a Handshake Name Using my Browser

You need to be behind an HNS resolver to access a Handshake name in any internet browser. This is possible by running your own HNS resolver on your device. You can also choose to configure your browser to use a DNS-over-HTTPS server that resolves Handshake names. Easyhandshake.com is one example of such a server, and people with even a little bit of domain hosting savvy can easily figure out how to start using DNS-over-HTTPS to resolve Handshake names.
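
As a rough illustration, here is what querying a DNS-over-HTTPS resolver for a Handshake name can look like using the common JSON DoH query format. The resolver URL is a placeholder – substitute the endpoint of whichever HNS-aware resolver you use, and confirm it supports JSON-style queries – and the name shown is only an example; the third-party ‘requests’ package is assumed.

```python
# Sketch of a JSON-format DNS-over-HTTPS query for a Handshake name.
# The resolver endpoint below is a placeholder, not a real service.
import requests  # pip install requests

RESOLVER = "https://doh.example-hns-resolver.com/dns-query"   # placeholder endpoint
name = "welcome.nb"                                           # example Handshake-style name

resp = requests.get(
    RESOLVER,
    params={"name": name, "type": "A"},
    headers={"accept": "application/dns-json"},
    timeout=10,
)
resp.raise_for_status()
for answer in resp.json().get("Answer", []):
    print(answer["name"], answer["data"])
```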

Several developers have rolled out browser extensions to allow standardized access to Handshake sites. Bob Wallet and LinkFrame are examples of two available for Google Chrome, and for Mozilla FireFox you’ll find that Resolvr works very well. Last mention here will be for Fingertip – an open-source, lightweight HNS resolver developed by Impervious and compatible with both Mac and Windows OS.

Unlocking Greater Cloud Data Value

There have been so many different types of constraints put on the digital operations of businesses during the pandemic that listing them all would be too much of a task. Ranging from inconveniences to full impediments, the problem with all of them was magnified so much by the concurrent new reality that so many people were changing the way they interacted with these businesses based on their own pandemic realities. There have been estimates that upwards of 80% of businesses in North America would have faced severe difficulties if they hadn’t been able to utilize cloud computing to get past these issues.

Here at 4GoodHosting we’re like any reliable Canadian web hosting provider in that we are just as wrapped up in the shift to the cloud when it comes to data. What’s true is that smaller businesses are now coming to terms with the way physical limitations related to data can slow their own operations and/or profitability too, and the rise in their numbers plus the demands that come with their needs has meant that new offerings like cloud data warehouses are a priority for those with the means of designing and offering them.

Innovation is spurred in part this way, and there’s been so much of it over recent years when it comes to non-physical data storage and the applications of it as it pertains to business operations. Fortunately those responsible for these innovations tend to not be the type to rest on their laurels, and that’s why we’re seeing more and more value in cloud data being unlocked for use by businesses.

Talking about these types of topics always comes with surprising and encouraging examples of how new technologies are being implemented, so let’s get right into some of them as well as talking more about how cloud data infrastructure and application continues to get better.

Better Scaling, Better Speed

It is also true that nowadays business leaders are under increasing pressure to make decisions with more in the way of speed and scale as well as collaborating in real-time to adapt to change with maximum effectiveness. Despite all of this, many companies are still having difficulty leveraging their data in the cloud, which limits progress and inhibits reaching their full potential.

Cloud-first analytics is where this has needed to go for a long time, and now it’s finally moving in that direction. Cloud data warehouses are now more common and more popular, especially with the way they allow businesses to better leverage data using a powerful and intuitive data management platform. This can be very pivotal in the transformation of business operations to meet new operating realities – and independent of the type of business in most cases.

That’s because the vast majority of businesses still currently use on-site data platforms that increasingly don’t meet the needs of the business, given the expectations users, clients, and customers now have. These limitations can be related to complexity, lack of scalability or inadequate elasticity, rigid monthly costs regardless of use, an inability to consolidate siloed data, or an inability to share data inside and outside the business.

Fixes need to be with cloud data platform solutions featuring applicability across use cases and locations, and it seems that developers have finally made the connection between theory and practice there. At least to the point that workable solutions are starting to be rolled out, but of course this is going to be a work in progress for a long time.

Speedy & Unimpeded Movement

The many new and different working realities these businesses face aren’t exclusively about movement between organizations like in the past. Locations and applications are a part of the equation now too, and free and fast data movement is key to enabling fast decision making. Before the cloud this would be done via file transfers, and the issue there was way too much latency, constraining options for builders and reducing efficiency to the point that it was a deal breaker in some cases.

Being able to share, govern, and access is a huge plus and cloud data platforms enable a data marketplace where organisations can use the technology to be more assured in making certain decisions regarding the direction of their business. The ability to infuse external data into their own data in real-time to forecast business impacts, predict supply and demand, apply models, and more is hugely beneficial.

Better and More Accurate Insights

Cloud-native data environments make it so that business data can be more intelligently matched to what customers need most based on their dynamic. Bringing data together and serving it back through dashboarding allows for data transformation without moving it. Nothing more is required in the way of extra resources, physical infrastructure, or teams and this allows businesses to see how they can best serve their customers and have better and more accurate foresight into what customers will want from them in the future.

Research bears out the merit in this – companies that use data effectively have 18% higher growth margins and 4% higher operating margins, and in the healthcare industry in particular there have been many noted use cases where advanced cloud data management principles and infrastructures have been revolutionary in creating better case outcomes and making service and care so much better right across the board.

We’ll see much more in the way of advancements related to cloud data management in the coming years, and there’s no doubt that some of them will relate to web hosting in Canada more than others. That means we stand to benefit, and the needs of increasing numbers of customers with more in mind for what they’d like to have as their web presence will be addressed too.

Easy Cloud Access May Increase Data Security Risk

It’s been said many times that you can’t stop progress, and that’s true to the point that it may be one of the more applicable maxims around these days. Especially when it comes to technology, as there’s no way any degree of stepping backwards is going to be tolerated if advances mean real benefits. Acronyms are a challenge for many, but even if you have the slightest amount of digital savvy you’ll know that SaaS stands for Software as a Service, and it’s one of many examples where cloud computing technology has made the hassles of hardware installation a thing of the past.

Here at 4GoodHosting we’ve had firsthand benefits from the Cloud and how it’s removed the need for a lot of physical hardware and infrastructure is something any Canadian web hosting provider will be able to relate to. As a collective user base we’re certainly not going to approve of any regression here either, but more and more we’re learning how there are security risks related to cloud infrastructure. That’s not news, and the fact that ease of access increases that risk probably doesn’t come as a surprise either.

But that’s the truth of the situation, and it’s something worth looking into, especially as businesses are flocking to software-as-a-service applications with the aim of improving the efficiency of their operations and overall employee productivity. The question is though – is weak control of access to cloud apps putting those organizations’ data at risk?

1.5x Exposure on Average

There was a recent study showing that the average 1,000-person company using certain SaaS apps is likely exposing data to anywhere from 1,000 to 15,000 external collaborators. Similar estimates from it suggested hundreds of companies, if not more, would also have access to a company’s data, and around 20% of a typical business’s SaaS files might be available for internal sharing with little more than the click of a link.

What can be taken away from that is that unmanageable SaaS data access is a legit problem that can apply to businesses of any size these days.

Last year, slightly more than 40% of data breaches occurred as the result of web application vulnerabilities according to this report. Nearly half of all data breaches can be attributed to SaaS applications, and seeing as how more and more businesses rely on this software, it is legitimately a huge threat. Especially when you consider that many companies store anywhere from 500k to a million assets in SaaS applications.

This looks to be even more of a problem in the future. The incorporation of SaaS services is predicted to grow, with revenues expected to jump a full 30% over the next 3+ years to 2025.

COVID Factor

This growth has been and will continue to be accelerated by the new working realities the COVID pandemic has created for us. This is because SaaS applications are easy to set up and don’t require the same outlay of time and resources from an IT department. The way businesses can identify problems and procure solutions on their own and within a timeframe that works for them is a huge plus.

Add to that as well the shift to working remotely for so many people and having the ability to access a SaaS from anywhere and on any device is something that is going to be pushing the appeal of Software as a Service for a long time yet to come. And in the bigger picture that is definitely a good thing.

This goes along with massive increases in the adoption of cloud services, choices made for all the same reasons and a similar part of the new digital workplace reality for a lot of people. Many organizations that had this shift in mind had their timetable accelerated because of the pandemic and the new need for the ability to have team members working remotely.

Software Visibility Gap

In the early 2000s there was a trend where free and small-scale SaaS offerings were still something of an unknown but at the most basic level they were very agreeable because they met needs very well and offered more speed and agility compared to conventional and standard options. They often really improved business results, and that’s why they took off from there.

But since then the meteoric growth in adoption has introduced problems, and in many ways they were ones that industry experts foresaw – even back then. Unmanaged assets will always pose some degree of risk, and by making it so that ease of access is expected from the user base they’ve also created the possibility of greater data insecurity.

This is what creates a software visibility gap, with the cloud obfuscating the inner workings of the applications and the data stored in them, and blurring insight into potential attacks to the point that security measures can’t be validated for effectiveness in the same way.

Problems with Data Everywhere

Cloud and SaaS platforms as they exist for the most part today make it so that the corporate network is no longer the only way to access data, and access gained through 3rd-party apps, IoT devices in the home, and portals created for external users like customers, partners, contractors and MSPs make security a much more complicated and challenging process.

It’s perfectly natural that companies are eager to use these access points to increase the functionality of their cloud and SaaS systems, but going in full bore without understanding how to secure and monitor them in the same way may lead to major access vulnerabilities that are beyond the capacity of the organization to identify and prepare against.

It’s entirely true that unmanaged SaaS usage means that sensitive corporate data may make its way out of the house, and do so long before those in charge of security become aware of the extent of the problem and what they might do to minimize the damage done.

When we consider further that SaaS applications often integrate with other SaaS applications the risk is magnified even further.

Responses in Progress

Organizations are making an effort to reduce the risk posed to their data by SaaS apps without stifling speed, creativity, and business success, but it’s not an easy fix at this point by any means. Security and IT teams can’t depend exclusively on in-house expertise to have the security measures they need in place in a timely manner, or at all. With the increasing complexity of cloud and SaaS environments, companies will need to use automated tools to ensure that their security settings are in line with business intent, along with continuous monitoring of security controls to prevent configuration drift.

Device Prices Set to Go Up Due to Chip Shortage

Anyone who knows of a quality smartphone that checks all the boxes and comes with an agreeable price tag can speak up and volunteer that information now. A good one that’s not going to be obsolete within a year or two is going to cost you, and it would seem that given recent worldly developments in the digital hardware sphere it might be that even the less expensive smartphones, laptops, and desktop computers are going to be going up in price quite a bit too. We’re entering an inflationary period in North America right now, but that’s not why prices on devices are shooting up.

It’s mostly related to how international chip makers are hamstrung in their ability to make the semiconductor chips in the same quantities they made them for years. Take a look at any of the very newest smartphones on the market and your enthusiasm for them based on the features is quickly slowed when you see how much they cost. If you’re a person who’s fine with a more standard and ordinary device this trend isn’t going to bother you too much, but if you’re all about the latest and greatest in technology – be prepared to pay quite a bit more for it.

There’s no getting around the basic principle of supply and demand with pretty much any consumer product in the world. It turns out this applies to components too, and when it comes to what enables these devices to work their magic, demand is now outdistancing supply like never before. Any Canadian web hosting provider like us here at 4GoodHosting has its own operating constraints related to demand outstripping supply, but it’s different when it’s the individual consumer who’s faced with the prospect of paying a LOT more when it’s time to upgrade or replace.

Wafers Wanted

Wafers aren’t only snacks; in fact they are an integral part of the chips that are so needed by mobile and computing device manufacturers these days. What’s happening now is that recent increases in wafer quotes by major manufacturers mean there’s going to be a serious impact on the price of actual hardware, including cell phones and a broad range of everyday consumer hardware. It’s believed that this is going to result in more consumers opting to buy lower-end hardware.

If you’re not familiar with the role these parts play, modern PCs and smartphones usually contain one or two key chips (CPU, GPU, SoC) made using the most advanced chip tech, like a leading-edge or advanced node. The foundries which make the chips have already increased pricing for their customers. Until recently most chip designers and other firms that make the finished products were hesitant to pass the price hikes on to their customers so entry-level and mainstream products were still agreeable to price-sensitive customers.

Now though the cumulative cost increases for some chips from 2020 to 2022 will be 30% or even more. It’s not possible to avoid passing this increase up the supply chain as margins are already very thin and these companies will not be okay with losing money. The expectation now is that chip designers will increase the prices they charge OEMs, and that will filter down to the end products in 2022.

Bigger BOM Costs

BOM is an acronym for Bill of Materials, and if vendors are going to pass on these higher wafer prices to OEMs then the estimate is that high-end smartphone BOM cost increases will be around 12% for 2022. The average BOM cost for a high-end smartphone is usually around $600. But what’s interesting is that entry-level phones could have their BOM cost affected even more; they could see a 16% increase.
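
Worked through quickly, those percentages look like this. The high-end figure comes from the paragraph above, while the entry-level BOM is an assumed number purely for illustration.

```python
# Effect of the projected BOM increases on two illustrative phones.
high_end_bom = 600       # stated average high-end BOM, USD
entry_level_bom = 150    # assumed entry-level BOM for illustration, USD

print(f"High-end BOM at +12%:    ${high_end_bom * 1.12:,.0f}")      # about $672
print(f"Entry-level BOM at +16%: ${entry_level_bom * 1.16:,.0f}")   # about $174
```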

So an anywhere from 12 to 16% increase in BOM cost can create a major impact on a device’s recommended price, and experts say these factors will keep pricing high for years to come. Making chips using leading-edge fabrication technologies like TSMC’s N7 and N5 or Samsung Foundry’s 7LPP and 5LPE is very pricey due to contract chip makers charging up to 3x more for processing wafers using their latest nodes.

Investments in this hardware as part of technology advances are usually made long before those chips start to earn money. It’s for this reason that only a handful of companies in the world can afford leading-edge processes.

It’s also forecasted that over the next few years technologies will remain mostly inaccessible for the majority of chip designers, and even rather advanced chips will still be produced on 16nm and 28nm-class nodes but with 10% to 18% increases in wholesale pricing attached to them.

Demand, and More Demand

The demand seen for all electronics devices is already higher than ever these days, and emerging and powerful trends like 5G, AI, and HPC all mean that the demand for chips will only get bigger. Experts foresee supply balances not coming around until mid-2023, and adding to all that is the fact that demand for equipment designed for lagging-edge nodes is growing faster than demand for tools aimed at leading-edge nodes – the same nodes that won’t be part of the newer technology chips that major manufacturers are going to be focused on producing.

Adding to this further is that major chip foundries have increased their quotes for 40/45 nm, 55/65 nm, 90 nm, and larger nodes multiple times since mid-2020. This is going to mean that the price of a wafer processed using 90nm technology will increase by 38% in 2022. Again, prices will be passed on to consumers.

The fact that these foundries have utilization rates above 100% nearly all the time means they spend more time processing wafers and less time on maintenance too. They will be even more reluctant to drop prices even when the demand-supply balance stabilizes.

More Will Go for Entry-Level Devices

Price-sensitive customers who buy higher-end smartphones and PCs may instead choose entry-level devices unless different midrange products appear on the market. The GPU market went through something similar not long ago. This happening with more popular devices like iPhones, Pixels, and Galaxies is quite likely.

This is because the price increases on chips made using mature nodes will affect the end costs attached to all devices. For high-end PCs and smartphones these additional costs won’t affect their recommended prices much at all. But for mainstream devices these additional costs may have a drastic effect on MSRP. Many buyers may feel they have to look past even midrange products and consider buying entry-level instead.

Servers – Why Bare Metal May Be Better


Most people don’t know the workings of what goes into their being able to surf the Internet and visit web pages. That’s perfectly fine unless you’re a developer or something similar. When you click on a URL what you’re doing is making a request, and that request is handled by a server. Back in the 1990s when the Internet was in its infancy there were requests being made, but nowhere near the massive numbers of them being made nowadays. This is why servers have been having a lot more asked of them all the time, and sometimes they just don’t have the capacity that’s needed of them.

Need and demand have always been the spurs for innovation, and this is no exception. The aim has been to design servers that have the ability to handle the ever-greater demands on them, and these days the top dog in that regard is a bare metal server. That may sound like a strange name, but they’re called bare metal servers because by being just ‘exposed metal’ they highlight the fully physical aspect of centralized and individual hosting of websites.

That’s because the appeal of bare metal servers is all about ‘single tenancy’ and having the best in performance, reliability and security. It means that your website will be as readily available as possible for visitors at all times, and of course if that site is a key part of your e-commerce business operations then having that performance, reliability, and security is going to be of primary importance for you. Here at 4GoodHosting it should come as no surprise that as a Canadian web hosting provider this is the kind of stuff we are very in the know about, so let’s get further into why bare metal tends to be best when it comes to servers.

  1. Better Relative Costs

The performance of on-premises servers and bare metal servers is fairly similar. The biggest cost savings come with datacenter space for hardware as well as data center power costs. These cost savings can be significant, and both offer varying degrees of quality of service. You will pay more upfront for a bare metal server, and that’s because they’re pretty much exclusive to the client in terms of dedicated resources.

What you get for that is unparalleled performance, hardware configurations, and nearly unlimited scalability. The next advantage is that bare metal server providers often offer per-hour billing plans rather than full service contracts paid in advance for the entirety of the term. Bare metal servers may seem pricier, but this is offset by the advantage of paying only for what you use.
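
As a quick illustration of why per-hour billing can offset a higher sticker price, here is a toy comparison. Both rates and the usage figure are assumptions rather than real provider pricing.

```python
# Toy comparison: per-hour billing versus a flat monthly contract (assumed numbers).
hourly_rate = 0.75        # assumed USD per hour for a bare metal server
flat_monthly = 500.00     # assumed flat monthly contract, USD

hours_needed = 480        # e.g. the server is only needed about 16 hours a day
print(f"Per-hour billing for the month: ${hourly_rate * hours_needed:,.2f}")   # $360.00
print(f"Flat monthly contract:          ${flat_monthly:,.2f}")
```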

  2. More Features for Business

Bare metal servers can be utilized and provide advantages for any business. That’s primarily because of the ability to configure them exactly how you want before deploying them. That can be done in a range of hardware configurations plus plenty of virtualization environments, and bare metal servers let businesses custom build their own servers with impressive flexibility and customization options. A business building on bare metal servers can create new operating systems, virtual operating systems, or convert an existing operating system to a virtual environment.

Eliminating hardware redundancy is the next part of the appeal. One example being how a bare metal server can be used to ensure that the server’s power is never turned off, which could mean less in the way of server downtime.

  3. More Agreeable Cost Factors

The biggest determining factor for most when considering a bare metal server will be the cost for top-end hardware and features. You’ll be evaluating which equipment you need, what features you require, and how much redundancy might still be needed with the hardware. The more you pay for your server, the more you’ll have in available cores, powerful hardware configurations, and available RAM.

One thing to factor into cost savings is the reliability of bare metal servers when it comes to downtime. This reliability comes from the fact that by not sharing the server with other renters, you’re not going to experience the kind of downtime that can cost a company when virtualized servers are the choice.

  4. Software Configurations are Usually Less Pricey

Software configuration is another important part of the equation. A bare metal server configured with dedicated hardware components and high-end graphics is going to be quite the powerhouse unit. If that can be acquired not-so-expensively, that’s going to be a huge plus. Businesses will be considering how much they want to dedicate their resources to the maintenance and support of this server. Some companies are really interested in expanding their virtualization and virtual computing space, and a good virtualization platform can make that a simpler process.

  5. Bare Metal Server Management and Overhead Costs

The management and general overhead costs for bare metal servers are much the same compared to virtualized servers. This usually isn’t a dissuading factor for decision makers, depending on what the organization wants from its servers.

We can consider how a bare metal server can have faster response times to the network than a virtualized server, something that definitely will be advantageous. The networking team can set up new server builds quickly and can configure the server with the features that they need in the shortest amount of time because a bare metal server can be set up quickly and without making any changes to the virtualization platform.