Chromium Manifest V3 Updates May Disable Ad Blockers

It’s likely that a good many of you are among the thousands upon thousands of people who have an ad blocker installed in their web browser of choice. Some people use them simply to avoid the nuisance of watching ad after ad, and it’s people like these that have led some sites to insist you ‘whitelist’ them before proceeding into the website. That’s perfectly understandable, as those paying advertisers are how the website generates income for the individual or business.

For others, however, a great deal of the working day is spent researching and referencing online, and having to watch ads before getting to the content we need slows that work down. For us, an ad blocker is much more a tool of necessity than of convenience. Still, we get caught up in more than a few sites that insist on being whitelisted too. For me, my ad blocker is a godsend, and I don’t whitelist or disable it for any website.

Here at 4GoodHosting, part of what makes us a good Canadian web hosting provider is having built up an insight into what really matters to our customers. The bulk of them are people who use the Information Superhighway as a production resource rather than web ‘surfers’ for whom it’s more of an entertainment one. That’s why today’s news is something that’s sure to be very relevant for most of our customers.

Weakened WebRequest APIs

Some of you may not know how your ad blocker works, and that’s perfectly normal; as long as it does its job, you don’t really need to know. Chromium is the open-source browser project that forms the basis of Google Chrome and a growing number of other browsers, so changes made there tend to reach most people’s web browser of choice before long.

However, Chromium developers have shared in the last few weeks that among the updates they are planning for Manifest V3 is one that will restrict the blocking version of the webRequest API. The alternative they’re introducing is called the declarativeNetRequest API.

After becoming aware of it, many ad blocker developers expressed their belief that the introduction of the declarativeNetRequest API will mean many already existing ad blockers won’t be ‘blocking’ much of anything anymore.

One industry expert stated on the subject, “If this limited declarativeNetRequest API ends up being the only way content blockers can accomplish their duty, this essentially means that existing and popular content blockers like uBO and uMatrix will cease to be functional.”

What is the Manifest V3 Version?

It’s basically a mechanism through which specific capabilities can be restricted to a certain class of extensions. These restrictions are indicated in the form of either a minimum or maximum manifest version.

Why the Update?

Currently, the webRequest API allows extensions to intercept requests and then modify, redirect, or block them. The basic flow of handling a request using this API is as follows:

  • Chromium receives the request
  • Chromium queries the extension
  • Chromium receives the result
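To make that flow concrete, here’s a minimal sketch in the spirit of a Manifest V2-era blocking listener. The blocklist and the `shouldBlock` helper are hypothetical, purely for illustration:

```javascript
// Hypothetical blocklist -- real ad blockers use far richer filter lists.
const BLOCKLIST = ["ads.example.com", "tracker.example.net"];

// Pure decision function: should a request to this URL be blocked?
function shouldBlock(url, blocklist = BLOCKLIST) {
  const host = new URL(url).hostname;
  return blocklist.some((b) => host === b || host.endsWith("." + b));
}

// In an actual extension, the listener is registered like this; the
// "blocking" flag and synchronous {cancel: ...} return value are what
// let the extension veto each request before it goes out.
if (typeof chrome !== "undefined" && chrome.webRequest) {
  chrome.webRequest.onBeforeRequest.addListener(
    (details) => ({ cancel: shouldBlock(details.url) }),
    { urls: ["<all_urls>"] },
    ["blocking"]
  );
}
```

It is exactly this per-request, extension-decides capability that the Manifest V3 change curtails.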

However, in Manifest V3 the blocking form of this API will be limited quite significantly. The non-blocking form of the API, which permits extensions to observe network requests but not modify, redirect, or block them, will not be discouraged. In addition, the exact limitations to be placed on the webRequest API have yet to be determined.

Manifest V3 is set to make the declarativeNetRequest API the primary content-blocking API in extensions. This API allows an extension to tell Chromium ahead of time what to do with a given request, instead of Chromium forwarding each request to the extension. This enables Chromium to handle a request synchronously. Google insists this API is overall a better performer and provides better privacy guarantees to users – the latter of which is of course very important these days.
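Under the new model, an extension ships declarative rules (typically in a static JSON file referenced from its manifest) and Chromium applies them itself. A sketch of what one such rule might look like, with a hypothetical ad domain:

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image", "xmlhttprequest"]
    }
  }
]
```

Because the browser evaluates these rules on its own, the extension’s code is never consulted per request – which is precisely what worries the authors of sophisticated filtering engines.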

Consensus Among Ad Blocker Developers and Maintainers?

When informed about this coming update, many developers were concerned that the change would end up completely disabling all ad blockers. The concern was that the proposed declarativeNetRequest API would make it impossible to develop new and functional filtering engine designs, because the declarativeNetRequest API is no more than the implementation of one specific filtering engine, and some ad blocker developers have commented that it’s very limited in scope.

It’s also believed that developers working with the declarativeNetRequest API will be unable to implement other features, such as blocking media elements that are larger than a set size and disabling JavaScript execution through the injection of CSP directives, among others.

Others are making the comparison to Safari’s content blocking API, which essentially puts limits on the number of admissible rules. Safari introduced that similar API recently, and the belief is that this is why Google is going in the same direction. Many seem to think that extensions written against that API are more usable, but still fall well short of the full power of uBlock Origin. The hope is that this API won’t be the last of its kind in the foreseeable future.

Dedicated IP Addresses and SEO

Even the most layman of web endeavourers will be familiar with the acronym SEO. We imagine, further, that there are very few individuals anywhere who don’t know it stands for search engine optimization, or who fail to understand just how integral SEO is to success in digital marketing. Most people with a small business that relies on its website for maximum visibility with prospective customers will hire an SEO professional to optimize their site. That continues to be highly recommended, and for 9 out of 10 people it is NOT something you can do effectively on your own, no matter how much you’ve read online or how many YouTube videos you’ve watched.

Here at 4GoodHosting, we are like any other top Canadian web hosting provider in that we offer SEO optimization services for our clients. Some people will think that choosing the best keywords and having them at the ideal density is most integral to good SEO, and that’s true by and large. But there are a number of smaller yet still significant factors that influence SEO, and they’ll be beyond the wherewithal of most people.

Whether websites benefit from a Dedicated IP address rather than a Shared IP address isn’t something you’ll hear discussed regularly. When you learn that the answer is yes, they do, and exactly why, it’s a switch many people will want to consider if they currently have a Shared IP address. Let’s have a look at why that is today.

What Exactly Is an IP address?

For some, we may need to start at the very beginning, so let’s begin by defining what exactly an IP address is. Any device connected to the Internet has a unique IP address, and that’s true whether it’s a PC, laptop, mobile device, or your web host’s server. It’s made up of a string of four numbers, each of which can range from 0 to 255. Here’s an example of one:

192.0.2.146

This numerical string identifies the machine you are using. Once it’s identified – and it has to be – the Internet is then able to send data to it, and you can access the countless websites along the Information Superhighway.
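As a quick illustration of that four-numbers-from-0-to-255 structure, here’s a small sketch (the function is our own, not part of any standard library) that checks whether a string is a well-formed IPv4 address:

```javascript
// Check that a string is four dot-separated numbers, each 0-255.
function isValidIPv4(address) {
  const parts = address.split(".");
  return (
    parts.length === 4 &&
    parts.every((p) => /^\d{1,3}$/.test(p) && Number(p) <= 255)
  );
}
```

Anything outside that shape – too few numbers, or a number above 255 – simply isn’t a valid IPv4 address.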

What’s a Shared IP address?

In most instances, the server your web host uses to host your site will be a single machine with a matching single IP address. For most people – and nearly all who go with the most basic hosting package without giving it much thought – you’ll be set up in an arrangement where the server is hosting thousands of websites like yours. It’s not ‘dedicated’ to you and your site exclusively.

Instead, all of the websites hosted on it will be represented by the single IP address allocated to the web host’s server. Now if your website is more of a personal venture or hobby and it’s NOT going to be a leverage point in trying to secure more business, shared hosting will probably be fine. Alternately, if page rankings are a priority for you, then shared hosting may be putting you at a disadvantage.

The solution? A dedicated IP address for your Canadian website. If you need one, we can take care of that for you quickly and fairly easily. But we imagine you’ll need more convincing, so let’s move now to explaining what constitutes a Dedicated IP address.

The Dedicated IP Address

A Dedicated IP address is an IP address allocated to a single website, instead of one being assigned to the server and representing every website hosted there by default. It often goes hand in hand with having your own dedicated server with only one website on it – yours – though a Dedicated IP can also be assigned to an individual site on a server that hosts more than one.

The Purpose of Dedicated IP Addresses

The primary appeal of Dedicated IP addresses is that they help make large ecommerce operations more secure, particularly as regards sensitive data like credit card numbers. On a more individual scale, though, a dedicated IP address is superior for SEO interests as well.

Why is that? Let’s list all of the reasons here:

1. Speed

When you share space, you share resources, and as far as shared web hosting and shared IP addresses are concerned that means you are sharing bandwidth. The long and short of it is that all those other sites on the same server will be slowing yours down. That might be a problem in itself, but even if it isn’t, the way slow site speeds push you further down Google’s rankings will be.

Adding a unique IP address to your site will not automatically mean it loads faster, but migrating to a Dedicated Server with a Dedicated IP address definitely will help. Sites with a Dedicated IP address are faster, more reliable, and more secure, and that’s a big deal.

2. SSL

For nearly 5 years now Google has been giving preference to websites that have added an SSL certificate with a 2048-bit key. The easiest way to see whether that’s been done is to check whether the site’s URL begins with HTTPS rather than HTTP. SSL sites typically utilize unique IP addresses. Google continues to insist that SSL impacts less than 1% of searches, but it’s a factor nonetheless and is another benefit of a Dedicated IP address.

SSL secures your website’s traffic across public networks, and it can also help websites operate marginally faster, since modern browser performance features are in practice reserved for HTTPS connections. Visitors get both a safer and often a snappier response from the website. The majority of ecommerce sites with a Dedicated IP address will also have an SSL cert.

3. Malware

Malware is software designed and disseminated for the explicit purpose of throwing wrenches into the gears of a working web system. Unfortunately, the thousands of websites that may be on a shared server drastically increase the risk of being exposed to malware if you’re one of them. Further, when you share an IP address with any site that’s been infected with malware, your site can actually be penalized despite the fact it’s not you who’s been infected.

In these cases, you’ll be best served by going with a Dedicated IP address and choosing a more reliable Canadian web hosting provider that has measures in place to prevent malware from making its way into the servers in the first place. A dedicated IP means you’re standing alone, and you’re regarded accordingly.

How Do I Get a Dedicated IP Address?

If you’re with us here at 4GoodHosting, all you need to do is ask. We’ve been setting our customers up with Dedicated IP addresses for quite some time now, and you’ll find that when you do so through us it’s not nearly as pricey as you had expected it to be.

It’s highly recommended for any ecommerce site, or one that’s utilized for strategic business aims, and it’s fair to say that you really can’t go wrong moving to a dedicated server if you’ve made the commitment to do anything and everything to protect your SEO and enjoy the same page rankings moving forward. The vast majority of people see it as a wise investment, and of course you always have the option of switching back to a shared hosting arrangement if over time you don’t see any real difference or benefit for you.

Global Environmental Sustainability with Data Centers

Last week we talked about key trends expected for software development in 2019, and today we’ll discuss another trend for the coming year that’s a bit more of a given: that datacenters will have even more demands placed on their capacities as we continue to become more of a digital working world all the time.

Indeed, datacenters have grown to be key partners for enterprises, rather than being just an external service utilized for storing data and business operation models. Even the smallest of issues in datacenter operations can impact business.

While datacenters are certainly the lifeblood of every business, they also have global impacts, particularly as relates to energy consumption. Somewhere in the vicinity of 3% of total electricity consumption worldwide goes to datacenters, and to put that in perspective, that’s more than the entire power consumption of the UK.

Datacenters also account for 2% of global greenhouse gas emissions and 2% of electronic waste (aka e-waste). Many people aren’t aware of the extent to which our increasingly digital world impacts the natural one so directly, but it really does.

Like any good Canadian web hosting provider serving thousands of customers, we have extensive datacenter requirements ourselves. Most providers will make efforts to ensure their datacenters operate as energy-efficiently as possible, and that goes along with the primary aim – making sure those datacenters are rock-solid reliable AND as secure as possible.

Let’s take a look today at what’s being done around the globe to promote environmental sustainability with data centers.

Lack of Environmental Policies

Super Micro Computer recently put out a report entitled ‘Data Centers and the Environment’, and it stated that 43% of organizations don’t have an environmental policy, with another 50% having no plans to develop any such policy anytime soon. The reasons why? High costs (29%), lack of resources or understanding (27%), and another 14% simply don’t make environmental issues a priority.

The aim of the report was to help datacenter managers better understand the environmental impact of datacenters, provide quantitative comparisons of other companies, and then in time help them reduce this impact.

Key Findings

28% of businesses take environmental issues into consideration when choosing datacenter technology

Priorities that came before it for most companies surveyed were security, performance, and connectivity, though 9% of companies considered ‘green’ technology to be the foremost priority. When it comes to actual datacenter design, however, the share of companies who put a priority on energy efficiency jumps up to 59%.

The Average PUE for a Datacenter is 1.89

Power Usage Effectiveness (PUE) is the ratio of the total energy consumed by a datacenter to the energy delivered to its IT equipment. The report found the average datacenter PUE is approximately 1.89, and many (over 2/3) of enterprise datacenters come in with a PUE over 2.03.

Further, it seems some 58% of companies are unaware of their datacenter’s PUE. Only a meagre 6% come in the highly efficient range between 1.0 and 1.19.
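The arithmetic behind PUE is straightforward; the figures in this sketch are hypothetical, chosen only to land on the report’s 1.89 average:

```javascript
// PUE = total facility energy / energy delivered to IT equipment.
// A PUE of 1.0 would mean every watt reaches the IT gear; anything
// above 1.0 is overhead: cooling, lighting, power conversion, etc.
function pue(totalFacilityKwh, itEquipmentKwh) {
  return totalFacilityKwh / itEquipmentKwh;
}

// Hypothetical facility: 1,890 MWh consumed overall to deliver
// 1,000 MWh to servers, storage, and networking.
const examplePue = pue(1890, 1000); // 1.89
```

Seen this way, an average PUE of 1.89 means nearly half of a typical datacenter’s power bill never reaches the computing equipment at all.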

24.6 Degrees C is the Average Datacenter Temperature

It’s common for companies to run datacenters at higher temperatures to reduce strain on HVAC systems and increase savings on energy consumption and related costs. The report found 43% of the datacenters have temperatures ranging between 21 degrees C and 24 degrees C.

The primary reasons indicated for not running datacenters at higher temperatures are reliability and performance. Hopefully these operators will soon come to learn that recent advancements in server technology have optimized thermal designs, and newer datacenter designs make use of free-air cooling. With them, datacenters can run at ambient temperatures up to 40 degrees C with no decrease in reliability or performance, which also helps improve PUE and save costs.

Another trend in datacenter technology is immersion cooling, where server hardware is cooled by being entirely immersed in a thermally conductive dielectric liquid. We can expect to see more of this type of datacenter technology rolled out this year too.

3/4 of Datacenters Have System Refreshes Within 5 Years

Datacenters and their energy consumption can be optimized through regular system refreshes that bring in modern, low-power technologies. The report found that approximately 45% of datacenter operators refresh their systems at least once every 3 years, while 28% of them do it every four to five years. It also seems that the larger the company, the more likely it is to do these refreshes.

8% Increase in Datacenter E-Waste Expected Each Year

It’s inevitable that electronic waste (e-waste) is created when datacenters dispose of server, storage, and networking equipment. It’s a bit of a staggering statistic when you learn that around 20 to 50 million metric tons of e-waste are disposed of every year around the world, and the main reason it’s so problematic is that e-waste deposits heavy metals and other hazardous materials into landfills. If left unchecked, and we continue to produce it as we have, e-waste disposal will increase by 8% each year.
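Compounded year over year, an 8% growth rate is dramatic. This sketch is our own, with hypothetical starting figures, just to show how quickly it adds up:

```javascript
// Project annual e-waste volume after some years of compound
// growth at a fixed annual rate (8% by default).
function projectEwaste(baseMegatons, years, annualGrowth = 0.08) {
  return baseMegatons * Math.pow(1 + annualGrowth, years);
}

// Starting from a hypothetical 50 megatons per year, a decade of
// 8% growth more than doubles the annual volume.
const inTenYears = projectEwaste(50, 10); // roughly 108 megatons
```

That doubling-in-a-decade behaviour is why the report treats unchecked e-waste growth as an urgent problem rather than a distant one.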

Some companies partner with recycling companies to dispose of e-waste, and some repurpose their hardware in any one of a number of different ways. The report found that some 12% of companies don’t have a recycling or repurposing program in place, typically because of high costs, the difficulty of finding partners or providers in their area, or a lack of proper planning.

On a more positive note, many companies are adopting policies to address the environmental issues that stem from their datacenter operations. Around 58% of companies already have an environmental policy in place or are developing one.

We can all agree that datacenters are an invaluable resource and absolutely essential for the digital connectivity of our modern world. However, they are ‘power pigs’, as the expression goes, and that’s unavoidable given the sheer volume of activity that goes on within them every day. We’ve seen how they’ve become marginally more energy efficient, and in the year to come we will hopefully see more energy-efficiency technology applied to them.

Key Trends in Software Development Expected for 2019

Here we are in the first week of 2019, and as expected we’ve got a whole lot on the horizon this year in the way of software development. We live in a world that’s more and more digital all the time, and the demands put on the software development industry are pretty much non-stop in response to this ongoing shift. Oftentimes it’s all about more efficient ‘straight-lining’ of tasks, as well as creating more of a can-do environment for people who need applications and the like to work smarter.

Here at 4GoodHosting, part of what makes us a reputable Canadian web hosting provider is the way we stay abreast of developments – not only in the web hosting industry, but also in the fields that have a direct relevance for our clients in the way they’re connected to computing and computing technology.

Today we’re going to discuss the key trends in software development that are expected for this coming year.

Continuing to Come a Long Way

Look back 10 years and you’ll surely agree the changes in the types of applications and websites that have been built – as well as how they’ve been built – are really quite something. The web of 2008 is almost unrecognizable. Today it is very much an app and API economy. It was only about 10 years ago that JavaScript frameworks were the newest and best thing around, but now building for browsers exclusively is very much a thing of the past.

In 2019 we’re going to see priorities remain on progressive web apps, artificial intelligence, and native app development. As adoption increases and new tools emerge, we can expect to see more radical shifts in the ways we work in the digital world. There’s going to be less in the way of ‘cutting edge’ and more in the way of refinements on technology, reflecting developers now having a better understanding of how technologies can be applied.

The biggest thing for web developers now is that they need to expand upon the stack as applications become increasingly lightweight (in large part due to libraries and frameworks like Vue and React), and data grows to be more intensive, which can be attributed to the range of services upon which applications and websites depend.

Reinventing Modern JavaScript Web Development

One of the things being seen is how topics that previously weren’t included under the umbrella of web development – microservices and native app development most notably – are now very much part of the need-to-know landscape.

The way many aspects of development have been simplified has forced developers to evaluate how these aspects fit together more closely. With all the layers of abstraction in modern development, the way things interact and work alongside each other becomes even more important. Having a level of wherewithal regarding this working relationship is very beneficial for any developer.

Those who’ve adapted to the new realities well will now agree that it’s no longer a case of writing the requisite code to make something run on the specific part of the application being worked on. Rather, it’s about understanding how the various pieces fit together from the backend to the front.

In 2019, developers will need to dive deeper and become inside-out familiar with their software systems. Being explicitly comfortable with backends will be an increasingly necessary starting point. Diving into the cloud and understanding that dynamic is also highly advisable. It will be wise to start playing with microservices. Rethinking and revisiting languages you thought you knew is a good idea too.

Be Familiar With Infrastructure to Tackle the Challenges of API Development

Some will be surprised to hear it, but as the stack shrinks and the responsibilities of web developers shift we can expect that having an understanding of the architectural components within the software being built will be wholly essential.

That reality is put in place by DevOps, which has essentially made developers responsible for how their code runs once it hits production. As a result, the requisite skills and toolchain of the modern developer are also expanding.

RESTful API Design Patterns and Best Practices

You can make your way into software architecture through a number of different avenues, but exploring API design is likely the best of them. Hands on RESTful API Design gives you a practical way into the topic.

REST is the industry standard for API design, and the diverse range of tools and approaches is making client management a potentially complex but interesting area. GraphQL, a query language developed by Facebook, is challenging REST’s dominance, while Redux and Relay – a pair of libraries for managing data in React applications – have both seen a significant amount of interest over the last year as key tools for working with APIs.

Microservices for Infrastructure Responsibility

Microservices are becoming the dominant architectural mode, and that’s the reason we’re seeing such an array of tools capable of managing APIs. Expect a whole lot more of them to be introduced this year, and be proactive in finding which ones work best for you. While you may not need to implement microservices now, if you want to be building software in 5 years’ time then you really should become explicitly familiar with the principles behind microservices and the tools that can assist you when using them.

We can expect to see containers being one of the central technologies driving microservices. You could run microservices in a virtual machine, but as they’re harder to scale than containers you likely wouldn’t see the benefits you’ll expect from a microservices architecture. As a result, really getting to know core container technologies should also be a real consideration.

The obvious place to start is with Docker. Developers need to understand it to varying degrees, but even those who don’t think they’ll be using it immediately will agree that the real-world foundation in containers it provides will be valuable knowledge to have at some point.
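A Dockerfile is the usual first contact with Docker. As a hedged example, here’s roughly what a minimal image for a hypothetical Node.js service might look like (the file names and port are assumptions):

```dockerfile
# Minimal image for a hypothetical Node.js microservice.
FROM node:lts-alpine

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY package*.json ./
RUN npm ci --omit=dev

# Copy the application source and declare how it runs.
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Even a sketch like this illustrates the core idea: the whole runtime environment is declared in a few lines and reproduced identically wherever the container runs.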

Kubernetes warrants mention here as well, as it is the go-to tool that allows you to scale and orchestrate containers. It offers control over how you scale application services in a way that would have been unimaginable a decade ago.
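To give a feel for that orchestration, here is a sketch of a Kubernetes Deployment that keeps three replicas of a container running; the names, image, and port are all hypothetical:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-service
spec:
  replicas: 3            # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: my-service
  template:
    metadata:
      labels:
        app: my-service
    spec:
      containers:
        - name: my-service
          image: registry.example.com/my-service:1.0  # hypothetical image
          ports:
            - containerPort: 3000
```

Scaling then becomes a one-line change to `replicas` (or a `kubectl scale` command), with Kubernetes handling scheduling and restarts for you.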

A great way for anyone to learn how Docker and Kubernetes come together as part of a fully integrated approach to development is with Hands on Microservices with Node.js.

Continued Embracing of the Cloud

It appears the general trend is towards full stack, and for this reason developers simply can’t afford to ignore cloud computing. The levels of abstraction it offers, and the various services and integrations that come with the leading cloud services make it so that many elements of the development process are much easier.

Issues surrounding scale, hardware, setup and maintenance nearly disappear entirely when you use cloud. Yes, cloud platforms bring their own set of challenges, but they also allow you to focus on more pressing issues and problems.

More importantly, however, they open up new opportunities. First and foremost among them is that going serverless becomes a possibility, which allows you to scale incredibly quickly by running everything on your cloud provider.

There are other advantages too, like using the cloud to incorporate advanced features such as artificial intelligence into your applications. AWS has a whole suite of machine learning tools: Amazon Lex helps you build conversational interfaces, and Amazon Polly turns text into speech. Azure Cognitive Services has a nice array of features for vision, speech, language, and search.

As a developer, it’s going to be increasingly important to see the cloud as a way of expanding the complexity of applications and processes while keeping them agile. Features and optimizations that previously might have been found sluggish or impossible can and should be developed as necessary and then incorporated. Leveraging AWS and Azure (among others) is going to be something that many developers will do with success in the coming year.

Back to Basics with New Languages & Fresh Approaches

All of this ostensible complexity in contemporary software development may lead some to think that languages don’t matter as much as they once did. It’s important to know that’s definitely not the case. Building up a deeper understanding of how languages work, what they offer, and where they come up short can make you a much more accomplished developer. Doing what it takes to be prepared is really good advice for what’s an ever-more unpredictable digital world, this year and in the years to follow.

We can expect to see a trend where developers go back to a language they know and explore a new paradigm within it, or they learn a new language from scratch.

Never Time to Be Complacent

We’ll reiterate what the experts we read are saying: that in just a matter of years, much of what is ‘emerging’ today will be old hat. It’s helpful to take a look at the set of skills many full stack developer job postings are requiring; you’ll see that the demands are so diverse that adaptability should be a real priority for a developer who wants to remain upwardly mobile within his or her profession. Without a doubt it will be immensely valuable both for your immediate projects and future career prospects.