What to Expect from the Top Dog in Early 2021 Chrome Update

Reading Time: 4 minutes

A lot of Google devotees will want to say otherwise, but Microsoft has definitely met the bar with the Edge browser seen on most of its newest offerings. The near-archaic Internet Explorer is just a blip in the rear-view mirror now, and there are perfectly good browsers in Safari and Firefox too. But Google Chrome is still the top dog on the block, more people choose it as their primary browser than any other, and that’s not likely to change anytime soon.

The numbers bend slightly one way or another, but Google Chrome generally accounts for about 65% of all browser usage worldwide. Firefox comes next, followed by Edge at 5% and Safari at 3.5% or so. Edge is definitely going to gain some ground here, and we can expect all sorts of study findings for 2020 coming out not too far into the new year.

Here at 4GoodHosting we imagine we’re much the same as any Canadian web hosting provider right about now in that our thoughts are primarily on the holidays and enjoying Christmas with our families. Each new year brings new developments, and seeing as how web browsing is something everyone is big on nowadays, we thought we’d talk this week about what we can expect to be new and fresh with Chrome when its next update releases on January 19, 2021.

Outsized Impact

Because such a large majority of device users prefer Chrome it’s fair to say that Chrome’s changes have an outsized impact. Google’s plans get the attention of everyone from individual users and IT admins to competitors and open-source allies.

We can expect this update to be like previous ones, accompanied by enterprise-centric release notes that highlight some of the additions, deletions, enhancements and modifications on their way. Here’s what we know.

New Permission Chip

Google is setting up Chrome 88 with a new permissions request that they’re referring to as a ‘chip’ to differentiate it from the usual pop-up prompt. This is on their understanding that a small UI element at the left end of the address bar (next to the padlock icon) is a less intrusive spot.

Chrome 87 users can see the chip in action by typing chrome://flags into the address bar, searching for #permission-chip, setting the field to “Enabled”, and relaunching the browser.

Once enabled, the chip appears as a blue oval at the left of the address bar, replacing the intrusive pop-up requests that have become more and more common on some websites.

No More Yosemite

It’s one of the US National Parks that is truly a spectacle of nature, but Yosemite the OS is getting put out to pasture with Chrome 88. Google will end support for Chrome on the 2014 OS X version, and Apple itself stopped serving Yosemite-powered Macs with security updates nearly three years ago.

Chrome 88 on the Mac will require OS X 10.11, El Capitan, or later.

End of Legacy Browser Add-On

Google will disable all installed instances of the Legacy Browser Support (LBS) add-on from the Chrome Web Store. LBS functionality is now fully incorporated into Chrome itself, letting IT admins deploy Google’s browser while still routing legacy sites elsewhere where needed.

Tab Search

Chrome 88 will let users search their open tabs by clicking the symbol to the right of the + in the tab bar and entering a search string. These searches cover all Chrome windows, not just the tabs in the active one.

Shortened Display URLs

Google will roll out a further-truncated URL in the address bar of Chrome 88, much like it has in previous releases. As an example, https://google-secure.example.com/secure-google-sign-in/ would display only the registrable domain – example.com.

The stated idea is to protect users from standard phishing strategies, like when criminals try to trick potential victims into clicking on links that look legitimate at first glance but are intentionally misleading.
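For a rough sense of what the browser is doing, here is a hedged Python sketch of that truncation – not Chrome’s actual implementation, just a last-two-labels heuristic, which breaks on multi-part suffixes like .co.uk (real browsers consult the Public Suffix List):

```python
from urllib.parse import urlparse

# Naive sketch of the display logic: keep only the last two labels of the
# hostname as the "registrable domain". Chrome's real logic uses the
# Public Suffix List, since two labels aren't enough for suffixes like .co.uk.
def registrable_domain(url):
    host = urlparse(url).hostname
    return ".".join(host.split(".")[-2:])

print(registrable_domain("https://google-secure.example.com/secure-google-sign-in/"))
# example.com
```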

Incompatibility with Old PCs

If anyone you know hasn’t pulled the plug on an old beater of a PC, maybe now is the time. Chrome 89 and up will require x86 processors with SSE3 – Streaming SIMD Extensions 3 – support, and Chrome will not install or run on x86 processors without it.

As a general timeline, this is going to affect many computers manufactured before 2004.

New Root Store

Google is switching from relying on the operating system’s built-in certificate verification to its own implementation for Chrome on all the browser’s supported platforms (minus iOS).

The idea is that with its own certificate root store Google could ensure users have a consistent experience across platforms, and developers get a consistent understanding of Chrome’s behavior. Plus, it’s believed that Chrome will be better able to protect the security and privacy of users’ connections to websites. This root store might not arrive until Chrome 90 though, which should arrive in early March 2021.

Merry Christmas to Everyone and we’ll see you again once before New Year’s.


Major 2020 A.I. Milestones

Reading Time: 4 minutes

We’re not flying in airborne vehicles like the Jetsons predicted we would be by 2020, but we do have robots, and artificial intelligence – or A.I. as it’s abbreviated for convenience – has factored into robotic technology very heavily over the last few years. It’s true that no household has the equivalent of ‘Rosey’ taking care of domestic duties and whatnot, but is there reason to think we might all have our own Rosey in the not-too-distant future?

Could be.

We’re really no longer in the time where A.I. was in its fledgling stage. It’s definitely starting to get wings, and the premise for what’s possible with artificial intelligence is really something. Of course there’s going to be a real need for measures and protocols in place to keep it on the ‘good’ side of things, but man oh man there’s reason to be excited about all the potential here.

And it’s the type of potential that is pretty enticing for anyone who works or provides services in the digital space, and that of course includes us here at 4GoodHosting along with every other Canadian web hosting provider. This kind of stuff always makes the grade with what we’ll consider newsworthy, so we thought it would be worthwhile to point out the major milestones for artificial intelligence we’ve seen across the year that’s drawing to a close now.

Here are six of the main developments and emerging themes seen in artificial intelligence during 2020.




Language Comprehension

Take your average year and a text-generating tool likely won’t make the cut as one of the most exciting new A.I. developments. But this is 2020 (the year the Jetsons were living in if you didn’t get the cartoon reference earlier) and GPT-3 is no ordinary text-generating tool.

For those in the know, it’s the follow-up to GPT-2, which at the time was labeled the ‘world’s most dangerous algorithm’, with all the sensationalism you care to attach to that. GPT-3 is a cutting-edge autoregressive natural-language-processing neural network created by the research lab OpenAI. All it needs is the first few lines at the beginning of a news story to generate impressively accurate text matching the style and content of those initial lines. It can even generate quotes that fit as naturally as those you’d get from a breathing interviewee.

Parameters are the connections within a model that are tuned in order to achieve performance. GPT-3 boasts an astonishing 175 billion parameters and reportedly ran up a bill of about $12 million to train.

Microsoft’s Turing Natural Language Generation (T-NLG) can get a shout-out here too. When it debuted in February 2020 at a massive 17 billion parameters, it was the largest language model to date. T-NLG can generate the words needed to complete unfinished sentences, as well as produce direct answers to questions and summarize documents.



Bigger Models

GPT-3 and T-NLG are themselves part of another milestone too. Extensive resource utilization is – as you’d expect – the norm in A.I. development, and along with it come enormous models with huge training costs. Neural networks with upward of a billion parameters are fast becoming the norm.

And more parameters are required all the time.

Look no further than new models like Meena, Turing-NLG, DistilBERT, and BST 9.4B. More parameters don’t necessarily mean better performance in every case, but they do make text-generating tools much more capable across a large range of functions. That’s important, because if real brain-like artificial intelligence is the aim, then more parameters are a must.




Philanthropy of Sorts

It’s not only computer scientists who are especially pleased about all of this. Researchers from other disciplines have utilized A.I. to benefit the less fortunate of humanity too. Examples are A.I.-backed technologies diagnosing tinnitus from brain scans or mind-reading headsets that use machine learning to help vocally impaired wearers turn thoughts into words.

The best example here may be DeepMind’s AlphaFold and its contribution to the biosciences. It’s impressively accurate at predicting the shape of proteins based on their sequences, potentially helping develop new, more effective therapies rapidly. This is all really good stuff.




Major Advances in Robotics

As intimidating as the thought of them might be, the truth is we need major advances in robotics to live more sustainably on the planet. There’s no shortage already of examples of A.I. and robotics carrying out human tasks, but the real advances have been with robotics that can be trusted with control functions at locations where living – or even an extended stay – is impractical or impossible for a human.

Robotics really is one area that is set to explode, powered in doing so by A.I.





Deepfakes

Deepfakes aren’t a 2020 invention, but they sure have stolen a lot of attention this year, and for good reason. There’s a natural sense of wonder when you consider what this technology is capable of, and even more so when you think about what’s possible when a deepfake rendering is given its own A.I. to both look AND act the part.

In July of 2020, researchers from the Center for Advanced Virtuality at the Massachusetts Institute of Technology created a deepfake video that was incredibly realistic and compelling, showing former President Richard Nixon delivering an alternate address about the moon landing. For a man who’s been dead for many years, he sure did look like his old self in the video.




Regulation of A.I. Will be Needed

With power comes the need for responsibility, and that is certainly going to apply to all the A.I.-powered tools we’re going to see over the next few years. Take for example how police in Detroit wrongly arrested a man after an algorithm mistakenly matched the photo on his driver’s license with blurry CCTV footage. This led to IBM, Amazon, and Microsoft all announcing they would reevaluate the use of their facial-recognition technologies.

This will very much apply to deepfakes too. California passed the AB-730 law, designed to criminalize certain uses of deepfakes and make it clear that those with the technology aren’t to use it for ill.

All this A.I. stuff is really fascinating, and we are as enthused as anyone to see what 2021 has in store along the same lines.

5 Tips: Better Ways for Web Scraping

Reading Time: 4 minutes

We imagine that for most of you the only scraping you do is part of your battle with baked-on food when doing dishes. Or maybe at the bottom of the peanut butter jar before washing it clean and recycling it. Most won’t be able to associate scraping with anything related to their desktop, but as always we like to talk about subjects that are relatable for web-savvy people with our blog.

Truth told, however, those of us here at 4GoodHosting aren’t experts on any particular part of it all outside of being a Canadian web hosting provider. So why would anyone be ‘scraping’ the World Wide Web, especially if there’s no peanut butter to be had in the first place? Joking of course, but we imagine there are a whole lot of people who don’t know what this is, along with those already engaged in web scraping who would be interested in how to do it better.

So let’s get to it.

What It Is

Web scraping is a process, and it is usually automated. The aim is to extract large amounts of data from websites. Web scraping gathers all the information/data from particular sites, or specific data as per whatever the requirements may be, and is usually done by companies and brands for data analysis, brand monitoring, and market research – most commonly with an overarching goal of fostering the brand’s growth and development.
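As a concrete illustration of the extraction step, here’s a minimal sketch using only Python’s standard library. The sample HTML is a made-up stand-in for a page a real scraper would have fetched over HTTP:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags -- the core of a basic scraper."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A real scraper would feed in HTML from an HTTP response;
# this static snippet stands in for one.
sample_html = '<p><a href="/pricing">Pricing</a> and <a href="/blog">Blog</a></p>'
parser = LinkExtractor()
parser.feed(sample_html)
print(parser.links)  # ['/pricing', '/blog']
```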

The problem for most is that it’s not easily done. Quite often IP blocking and geo-restrictions serve as impediments. These are of course related to security, which is built in on many websites. However, there are ways to scrape better and more effectively. The most common of these tips is using residential IP proxies for higher security, but there are other solid ones too.

Popular sites will have built-in features that incorporate techniques and strategies to prevent developers from scraping them, and IP address detection is definitely the most common of them, with many larger sites using detection tools to keep suspicious IP addresses from scraping them. Other scrape-prevention methods include CAPTCHAs, HTTP request header checking, JavaScript checks, and others.

There are ways to get past those blocks, and that’s what leads us to the next part of this discussion.

5 Tips for Better Web Scraping

Using Proxies

It’s smart to use different proxies when web scraping as a means of preventing your IP address from being blocked. If your IP address can be easily detected, it’s probably going to be blocked, and using just one IP address to scrape makes it that much easier for websites to track your IP address and then block it.

Using proxies that offer higher security is the best way to solve this issue. Proxies mask or hide your real IP address to make it difficult to detect. Proxies also provide you with multiple IPs that you can use for web scraping, and being from diverse locations they usually get past geo-blocking or geo-restrictions.

All sorts of different kinds of proxies exist, but residential IP proxies are the best for web scraping because they’re difficult to flag as proxies due to being traced back to actual physical locations. Identifying or banning them is difficult.
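As a sketch of the mechanics, here’s how a scraper written with Python’s standard library would route its traffic through a proxy. The proxy address is a made-up placeholder – a real residential proxy endpoint would come from your provider:

```python
import urllib.request

# Hypothetical proxy endpoint -- substitute the address your provider gives you.
PROXY = "http://198.51.100.7:8080"

# ProxyHandler routes matching requests through the proxy, so the target
# site sees the proxy's IP address instead of yours.
proxy_handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(proxy_handler)

# opener.open("https://example.com") would now travel via the proxy;
# the call itself is left out so this sketch runs without a live proxy.
print(type(opener).__name__)  # OpenerDirector
```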

IP Rotation

If all the requests for scraping come from the same IP address then that IP address will almost certainly get banned with a site’s IP detection provisions working as they should. However, what if you use several different IPs for sending web scraping requests? This works, because it becomes difficult for websites to trace so many different IPs at the same time. You can get around being identified this way.

IP rotation is essentially switching between different IP addresses. Rotational proxies are automated proxies that switch your IP address at set intervals – every 10 minutes, for example. This constant switching lets you perform web scraping without the same risk of being IP blocked.
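Rotation itself is simple to sketch: keep a pool of proxy addresses and hand out the next one for every request. The addresses below are placeholders for whatever a rotating-proxy provider supplies:

```python
import itertools

# Hypothetical pool of proxy addresses from a rotating-proxy provider.
proxy_pool = [
    "http://198.51.100.7:8080",
    "http://198.51.100.8:8080",
    "http://198.51.100.9:8080",
]

# cycle() repeats the pool endlessly, so each request gets the next proxy.
rotation = itertools.cycle(proxy_pool)

def next_proxy():
    """Return the proxy address to use for the next scraping request."""
    return next(rotation)

# Five requests spread across the three proxies in round-robin order:
used = [next_proxy() for _ in range(5)]
print(used)
```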

Random Intervals between Data Requests

Implementing random intervals between data requests is a proven-effective trick for performing web scraping to the extent you want. Websites can detect your IP address much more easily if you send data requests at fixed or regular intervals. If you use web scrapers capable of sending randomized requests, however, it becomes much more difficult to identify your IP and block it.
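In code, the randomized pacing amounts to a line of jitter between fetches. A sketch (the 2–7 second default range here is just an illustrative choice, not a magic number):

```python
import random
import time

def polite_delay(low=2.0, high=7.0):
    """Sleep for a random interval so requests don't arrive on a fixed beat."""
    pause = random.uniform(low, high)
    time.sleep(pause)
    return pause

# You would call polite_delay() between successive page fetches;
# a tiny range keeps this demonstration quick.
waited = polite_delay(0.01, 0.02)
print(round(waited, 3))
```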

Utilize a Captcha Solving Service

You probably already have experience with these, having to confirm your ‘I’m not a robot’ identity before access to a website is possible. Captchas are among the most common anti-scraping techniques, and Captcha solving services can be used to scrape data from such sites. There are different services available for Captcha solving, such as narrow Captcha, Scraper API, and many more, and for most it’s not difficult to find one that fits their needs if there’s data scraping to be done.

Check for Honeypots

Many websites have honeypots preventing unauthorized use of the site’s information. What is a honeypot? It’s an invisible link used to catch hackers and web scrapers extracting data from websites. Performing honeypot checks is something you need to do if you’re going to scrape a site; choose not to and you’re probably going to be blocked.
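A honeypot check can be as simple as skipping links the page hides from human visitors. This sketch flags inline-CSS hiding only – real pages can hide links in other ways (stylesheets, off-screen positioning), so treat it as a starting point:

```python
from html.parser import HTMLParser

class HoneypotFilter(HTMLParser):
    """Collects hrefs, skipping links hidden with inline CSS -- a honeypot tell."""
    def __init__(self):
        super().__init__()
        self.visible_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        style = (attrs.get("style") or "").replace(" ", "").lower()
        if "display:none" in style or "visibility:hidden" in style:
            return  # likely a honeypot link -- never follow it
        if attrs.get("href"):
            self.visible_links.append(attrs["href"])

sample = ('<a href="/real-page">Real</a>'
          '<a href="/trap" style="display: none">Trap</a>')
f = HoneypotFilter()
f.feed(sample)
print(f.visible_links)  # ['/real-page']
```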


Even if you are in the know about the best ways to do it, web scraping remains difficult at the best of times. Using a residential IP proxy is one of the most commonly used strategies to prevent IP blocking, and it’s both the most effective and most easily done approach of all the ones listed here.


Internet Expands to 370.7 Million Domain Name Registrations

Reading Time: 3 minutes

1991 sure was a long time ago, and yet at a certain point nearly 30 years ago there was quite literally one – and only one – person on the World Wide Web. Tim Berners-Lee certainly isn’t a household name the way many other technology founders are, but perhaps he should be. If you don’t know of him and can’t see where we are going with this, he’s the man who created the World Wide Web.
A pivotal moment in human history for sure, and especially noteworthy seeing as how his creation has turned out to be an immeasurable blessing for global connectivity and productivity. On the other side, however, it’s done great harm to many people too so maybe it’s the proverbial ‘double-edged sword’ in that regard. I know I’m sure as shoot glad the Internet didn’t exist when I was a teenager.
That said, the significance of what Internet connectivity offers in the way of instant and wide-reaching access to information certainly isn’t lost on us here at 4GoodHosting. Like any good Canadian web hosting provider, the sheer volume of customers we provide with hosting for their sites indicates that people are immensely receptive to what being online can do for their businesses, ventures, or personal interests / avocational pursuits.
More and more people are looking for reliable web hosting in Canada every day, and that’s going to be the same for any country in the world really. Which is the ideal segue into the news we’ll share as our topic this week – not surprisingly, the Internet continues to be increasingly populated!

Big Jump

The third quarter of this current year closed with 370.7 million domain name registrations across all top-level domains (TLDs). That’s an increase of 0.6 million domain name registrations compared to the second quarter of the year. All this according to VeriSign, a global provider of domain name registry services and Internet infrastructure.
Want an understanding of just how much of a priority people are putting on going online? Look no further than the fact that domain name registrations have grown by 10.8 million, or 3.0 percent, year over year. No matter what you’re counting, 10.8 million new ones in a single year is some pretty darn explosive growth.

.com and .net Dominance

This part shouldn’t come as a surprise to the same extent. Country-specific domain extensions like .ca for here in Canada are popular, but not nearly in the same way .com and .net are. These two had a combined total of 163.7 million domain name registrations in the domain name base as the third quarter of 2020 ended.
This works out to an increase of 1.7 million domain name registrations, or 1%, compared to the second quarter of 2020. Year over year, the .com and .net TLDs added 6.3 million domain name registrations, which works out to a 4% increase.
The .com domain name base comes in at 150.3 million domain name registrations, while 13.4 million .net domain names have been registered.

New Registrations for Domain Names

New .com and .net domain name registrations are always arriving too, and between them 10.9 million were registered this year to the end of the 2020 third quarter. That’s in comparison to 9.9 million domain name registrations at the end of the third quarter of 2019.
While these numbers are fairly natural and expected, and a good sign for the health and vitality of e-commerce around the world, there are increased logistical challenges that come with so many domains and sites hosted on the web.
We would be qualified to talk about those logistics in great detail given the nature of what we do here, but instead we’ll conclude here this week by saying that the best thing that could happen would be if there was a collective movement worldwide to hold people to certain standards regarding how and for what purpose they use the Web.
Lofty thinking for sure, but it’s a part of the bigger picture of what we believe in when it comes to doing good in the world. We’re lucky to have the Internet, and we’d do well to strive to keep it working for the collective good at all times.