More Repair Options for PCs on Way for 2023

Neither Apple nor Microsoft does much in the way of making their devices easily repairable or upgradeable, and while wanting to keep their stuff proprietary as much as possible is understandable, it’s not good that so many PCs and other computing devices are discarded and end up as electronic waste instead of being repaired. The basics of electronic device repair aren’t that difficult to grasp, and you might be surprised what can be done with know-how, a steady hand and some soldering skills.

Using devices that are able to access the web is a huge part of daily life for so many people, and it would be beneficial to limit the amount of e-waste we create when getting rid of ones that could still have a longer working life. This is why it’s good news that Microsoft has announced it is going to make desktop and notebook PC repair much more accessible to people. This will also have huge benefits for providing fully functional computing devices to developing regions of the world, where they will assist with education and other interests.

Trying to minimize their environmental footprint is a priority for any quality Canadian web hosting provider in the same way it is for all businesses these days, and at 4GoodHosting we see the value in making people aware of news like this that is in line with environmental interests related to digital devices. E-waste is a problem, and it is going to be very beneficial if people can have their computers and other devices repaired more easily so they don’t have to keep buying new ones and furthering the cycle.

Around a Trend

A large portion of the carbon emissions associated with the devices we own are made during manufacturing. Replacing products before the true end of their working life causes the emissions, pollution, natural resource use, and land degradation associated with extracting and refining raw materials to go way up, and it means more toxic e-waste polluting the environment in places like Agbogbloshie, Ghana and Guiyu, China.

The White House is already moving towards legislation that will have the US FTC dismantling repair restrictions around phones and electronics, and this is something that has long been needed here in North America and around the world. It’s also about ensuring that lower income families and individuals can have the same degree of web connectivity, along with the basic rationale of being able to repair something you use as a tool in the same way you can your motor vehicle.

Both take you to destinations in a sense. The reason you’re soon going to be able to take Microsoft products to 3rd-party repair services OR fix them more easily yourself is because of As You Sow, an activist group that promotes companies being more aware of the environmental degradation that comes with excessive e-waste resulting from the shortened lifespans of devices. They were able to make this request as part of a shareholder resolution that they were entitled to present.

Their request is that Microsoft analyze the environmental benefits of making its products easier to repair, and now Microsoft is promising to ‘expand the availability of certain parts and repair documentation beyond Microsoft’s Authorized Service Provider network.’ They are also going to offer new mechanisms to enable and facilitate local repair options for consumers, allowing them to have their Microsoft devices repaired outside what is now a limited network of authorized repair shops.

Right to Repair Movement

Just this summer US President Joe Biden issued an executive order instructing the Federal Trade Commission to craft new rules addressing unfair, anticompetitive restrictions on third-party repair. As of right now 27 states are looking at passing right-to-repair bills, and New York has introduced a first-of-its-kind broad right-to-repair bill that targets all sorts of consumer products that should be repairable if parts are made more readily available by the manufacturer.

A similar type of request has been made to Apple, and industry experts say it is very likely that all major manufacturers will need to be able to prove they are operating in a more ecologically friendly manner. All sorts of consumer electronics should be made easier to fix yourself, and although that will mean fewer products being produced and sold, it really is high time that something like this happens considering just how problematic planned obsolescence and the like really are.

We are definitely fans of the Right to Repair Movement, and we’re happy to see that there are similar movements here in Canada that are pushing for the same sort of outcomes. If you don’t already have a soldering iron at home, it might be time to get one.

All About Handshake Domain Names

Ever since the web was in its infancy and URLs were just starting to be a thing, internet names that are TLDs (Top Level Domains) have been administered by ICANN, a centralized organization that has outlived its usefulness for managing internet names in the opinion of many knowledgeable people in the industry. It’s only very recently that legitimate alternatives to this monopoly of sorts have come into existence, but the one that’s really generating some buzz these days is Handshake.

It is the exact opposite of ICANN, particularly in the way it is a decentralized naming solution for the Internet powered by blockchain technology – another major disruptor in the industry that we’ve touched on here on a number of different occasions. HNS is the abbreviation for the Handshake naming system, a peer-to-peer network and decentralized system that uses blockchain to offer better control, freedom, and security for domains and websites.

As you’d expect, this sort of development comes up immediately on the radar for those of us here at 4GoodHosting, in the same way it would for any good Canadian web hosting provider that likes to have its thumb on the pulse of web hosting technology and the options that become available to people who need to claim their spot on the web and use it to their personal or business advantage. The appeal of HNS naming is that it is in line with decentralizing the web and allowing for a fairer reorganizing of the Internet.

So how does Handshake domain naming work, and what exactly makes it better for individual users? That’s what we’ll look at this week.

Handshake Domains – How Do They Work?

Let’s start here with a basic refresher on domain names. All websites accessible on the Internet are found on servers identified by Internet Protocol (IP) addresses. Users aren’t expected to know IP addresses, so internet names are mapped to their corresponding servers by means of the domain name system (DNS). DNS is not centralized, but the ultimate control of names via the DNS system is held by a limited number of interest groups, and they don’t always act equitably.
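
To make that name-to-address mapping concrete, here’s a minimal sketch that asks the system’s configured (traditional, ICANN-rooted) resolver for the addresses behind a hostname. The hostname queried is just an illustration.

```python
import socket

def resolve(hostname: str) -> list[str]:
    # getaddrinfo consults the system's DNS resolver and returns one
    # tuple per (family, type) combination; the sockaddr's first
    # element is the IP address we're after.
    results = socket.getaddrinfo(hostname, None, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in results})

if __name__ == "__main__":
    print(resolve("example.com"))  # e.g. ['93.184.216.34']
```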

The Handshake name system is entirely different by design. While it also maps names to IP addresses and can be utilized in essentially the same way as the traditional DNS, names are administered by a blockchain model instead of a single centralized entity. What is key here is how Handshake takes decentralized control of the root zone and can then be used for so much more than just mapping to servers in the internet space.

As a decentralized, permissionless naming protocol where every peer validates and helps manage the root DNS naming zone, Handshake offers a much more agreeable vision of how control of TLDs is made available – a fairer system, and one that doesn’t favor some greatly at the expense of others.

It’s really starting to emerge as an alternative to existing Certificate Authorities and naming systems, and it’s a darn good thing.

Distribution of Handshake Names

A permissionless system comes with more of a chance of name ‘squatting’, so the Handshake protocol reserves the top 100K domain names according to Alexa.com as well as giving priority on existing TLDs to their current owners. As a result, and to use one example, Google – which currently leases google.com from Verisign, the controller of the .com TLD – can instead lay claim to the ‘Google’ name via the Handshake blockchain.

This is applicable for less competitive domain names too, with the blockchain facilitating name auctions that can be bid on by anyone in possession of Handshake tokens. This would deliver a very different owner, user, and visitor experience right across the board, but what is interesting to note is that with HNS the internet user navigates to a website in an entirely decentralized manner, with nothing in the way of censorship from a centralized authority.
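
For the curious, Handshake allocates names through sealed-bid, second-price (Vickrey) auctions, where the winner pays the second-highest bid rather than their own. Here’s a minimal sketch of that settlement rule, with made-up bidders and amounts purely for illustration.

```python
def settle_auction(bids: list[tuple[str, float]]) -> tuple[str, float]:
    # Rank sealed bids from highest to lowest.
    ranked = sorted(bids, key=lambda bid: bid[1], reverse=True)
    winner = ranked[0][0]
    # Second-price rule: the winner pays the runner-up's bid.
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

print(settle_auction([("alice", 50.0), ("bob", 120.0), ("carol", 80.0)]))
# -> ('bob', 80.0): bob wins the name but pays carol's bid of 80 HNS
```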

Entities that are able to take domain names away from owners under the current ICANN style of governance would be rendered powerless by a Handshake domain name system powered by blockchain. If you’d like to learn more about uncensorable domain names, you can find quite a bit of information out there.

Accessing a Handshake Name Using Your Browser

You need to be behind an HNS resolver to access a Handshake name in any internet browser. This is possible by running your own HNS resolver on your device. You can also choose to configure your browser to use a DNS-over-HTTPS server that resolves Handshake names. Easyhandshake.com is one example of such a server, and people with even a little bit of domain hosting savvy can easily figure out how to start using DNS-over-HTTPS to resolve Handshake names.
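
As a rough sketch of what that looks like programmatically, the snippet below sends a standard DNS query to a Handshake-aware DNS-over-HTTPS server using the dnspython library. The endpoint URL, port, and queried name here are assumptions for illustration only – substitute the actual DoH URL published by whichever resolver service you use.

```python
# pip install dnspython (DoH support also needs requests or httpx)
import dns.message
import dns.query
import dns.rdatatype

# Hypothetical Handshake-aware DoH endpoint; confirm the real URL
# and port with the resolver service you actually use.
HNS_DOH_URL = "https://easyhandshake.com:8053/dns-query"

def resolve_hns(name: str) -> list[str]:
    # Build a standard A-record query and send it over HTTPS instead
    # of plain port-53 DNS; the server, not the client, knows HNS.
    query = dns.message.make_query(name, dns.rdatatype.A)
    response = dns.query.https(query, HNS_DOH_URL)
    return [
        item.address
        for rrset in response.answer
        if rrset.rdtype == dns.rdatatype.A
        for item in rrset
    ]

print(resolve_hns("somehandshakename"))  # a made-up HNS name
```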

Several developers have rolled out browser extensions to allow standardized access to Handshake sites. Bob Wallet and LinkFrame are two examples available for Google Chrome, and for Mozilla Firefox you’ll find that Resolvr works very well. Last mention here goes to Fingertip – an open-source, lightweight HNS resolver developed by Impervious and compatible with both Mac and Windows OS.

Unlocking Greater Cloud Data Value

There have been so many different types of constraints put on the digital operations of businesses during the pandemic that listing them all would be too much of a task. Ranging from inconveniences to full impediments, the problem with all of them was magnified by the concurrent new reality that so many people were changing the way they interacted with these businesses based on their own pandemic realities. There have been estimates that upwards of 80% of businesses in North America would have faced severe difficulties if they hadn’t been able to utilize cloud computing to get past these issues.

Here at 4GoodHosting we’re like any reliable Canadian web hosting provider in that we are just as wrapped up in the shift to the cloud when it comes to data. Smaller businesses are now coming to terms with the way physical limitations related to data can slow their own operations and / or profitability too, and the rise in their numbers plus the demands that come with their needs has meant that new offerings like cloud data warehouses are a priority for those with the means of designing and offering them.

Innovation is spurred in part this way, and there’s been so much of it over recent years when it comes to non-physical data storage and the applications of it as it pertains to business operations. Fortunately those responsible for these innovations tend to not be the type to rest on their laurels, and that’s why we’re seeing more and more value in cloud data being unlocked for use by businesses.

Talking about these types of topics always comes with surprising and encouraging examples of how new technologies are being implemented, so let’s get right into some of them as well as talking more about how cloud data infrastructure and application continues to get better.

Better Scaling, Better Speed

It is also true that nowadays business leaders are under increasing pressure to make decisions with more in the way of speed and scale, as well as to collaborate in real time to adapt to change with maximum effectiveness. Despite all of this, many companies are still having difficulty leveraging their data in the cloud, which limits progress and inhibits reaching their full potential.

Cloud-first analytics is where this has needed to go for a long time, and now it’s finally moving in that direction. Cloud data warehouses are now more common and more popular, especially with the way they allow businesses to better leverage data using a powerful and intuitive data management platform. This can be very pivotal in the transformation of business operations to meet new operating realities – and independent of the type of business in most cases.

That’s because the vast majority of businesses still use on-site data platforms that increasingly don’t meet the needs of the business, given the expectations of users, clients, and customers. These limitations can be related to complexity, lack of scalability or inadequate elasticity, rigid monthly costs regardless of use, inability to consolidate siloed data, or an inability to share data inside and outside the business.

Fixes need to come in the form of cloud data platform solutions with applicability across use cases and locations, and it seems that developers have finally made the connection between theory and practice there. At least to the point that workable solutions are starting to be rolled out, though of course this is going to be a work in progress for a long time.

Speedy & Unimpeded Movement

The many new and different working realities these businesses face aren’t exclusively about movement between organizations like in the past. Locations and applications are part of the equation now too, and free and fast data movement is key to enabling fast decision making. Before the cloud this was done via file transfers, and the issue there was far too much latency, constraining options for builders and reducing efficiency to the point that it was a deal breaker in some cases.

Being able to share, govern, and access data is a huge plus, and cloud data platforms enable a data marketplace where organizations can use the technology to be more assured in making decisions about the direction of their business. The ability to infuse external data into their own data in real time to forecast business impacts, predict supply and demand, apply models, and more is hugely beneficial.

Better and More Accurate Insights

Cloud-native data environments make it so that business data can be more intelligently matched to what customers need most based on how they behave. Bringing data together and serving it back through dashboarding allows for data transformation without moving the data itself. Nothing more is required in the way of extra resources, physical infrastructure, or teams, and this allows businesses to see how they can best serve their customers and have better and more accurate foresight into what customers will want from them in the future.

Research bears out the merit in this – companies that use data effectively have 18% higher growth margins and 4% higher operating margins, and in the healthcare industry in particular there have been many noted use cases where advanced cloud data management principles and infrastructures have been revolutionary in creating better case outcomes and making service and care so much better right across the board.

We’ll see much more in the way of advancements related to cloud data management in the coming years, and there’s no doubt that some of them will relate to web hosting in Canada more than others. That means we stand to benefit, and the needs of increasing numbers of customers with more in mind for what they’d like to have as their web presence will be addressed too.

Easy Cloud Access May Increase Data Security Risk

It’s been said many times that you can’t stop progress, and that’s true to the point that it may be one of the more applicable maxims around these days. Especially when it comes to technology, as no degree of stepping backwards is going to be tolerated if advances mean real benefits. Acronyms are a challenge for many, but even if you have the slightest amount of digital savvy you’ll know that SaaS stands for Software as a Service, and it’s one of many examples where cloud computing technology has made the hassles of hardware installation a thing of the past.

Here at 4GoodHosting we’ve seen firsthand benefits from the cloud, and the way it’s removed the need for a lot of physical hardware and infrastructure is something any Canadian web hosting provider will be able to relate to. As a collective user base we’re certainly not going to approve of any regression here either, but more and more we’re learning that there are security risks related to cloud infrastructure. That’s not news, and the fact that ease of access increases that risk probably doesn’t come as a surprise either.

But that’s the truth of the situation, and it’s something worth looking into, especially as businesses are flocking to software-as-a-service applications with the aim of improving the efficiency of their operations and overall employee productivity. The question is though – is weak control of access to cloud apps putting those organizations’ data at risk?

1.5x Exposure on Average

A recent study showed that the average 1,000-person company using certain SaaS apps is likely exposing data to anywhere from 1,000 to 15,000 external collaborators. Similar estimates from it suggested hundreds of companies, if not more, would also have access to a company’s data, and around 20% of a typical business’s SaaS files might be available for internal sharing with little more than the click of a link.

What can be taken away from that is that unmanageable SaaS data access is a legit problem that can apply to businesses of any size these days.

Last year, slightly more than 40% of data breaches occurred as the result of web application vulnerabilities, according to this report. Nearly half of all data breaches can be attributed to SaaS applications, and seeing as how more and more businesses rely on this software, it is legitimately a huge threat. Especially when you consider that many companies store anywhere from 500k to a million assets in SaaS applications.

This looks to be even more of a problem in the future. The incorporation of SaaS services is predicted to grow, with revenues expected to jump a full 30% over the next 3+ years to 2025.

COVID Factor

This growth has been and will continue to be accelerated by the new working realities the COVID pandemic has created for us. This is because SaaS applications are easy to set up and don’t require the same outlay of time and resources from an IT department. The way businesses can identify problems and procure solutions on their own, and within a timeframe that works for them, is a huge plus.

Add to that the shift to working remotely for so many people, and the ability to access a SaaS from anywhere and on any device is something that is going to keep pushing the appeal of Software as a Service for a long time yet to come. And in the bigger picture that is definitely a good thing.

This goes along with massive increases in the adoption of cloud services – choices made for all the same reasons, and a similar part of the new digital workplace reality for a lot of people. Many organizations that had this shift in mind had their timetable accelerated by the pandemic and the new need to have team members working remotely.

Software Visibility Gap

In the early 2000s, free and small-scale SaaS offerings were still something of an unknown, but at the most basic level they were very agreeable because they met needs well and offered more speed and agility compared to conventional options. They often genuinely improved business results, and that’s why they took off from there.

But since then the meteoric growth in adoption has introduced problems, and in many ways they are ones that industry experts foresaw – even back then. Unmanaged assets will always pose some degree of risk, and by making ease of access something the user base expects, these services have also created the possibility of greater data insecurity.

This is what creates a software visibility gap, with the cloud obscuring the inner workings of applications and the data stored in them, and blurring insight into potential attacks to the point that security measures can’t be validated for effectiveness the way they once could.

Problems with Data Everywhere

Cloud and SaaS platforms as they exist for the most part today mean the corporate network is no longer the only way to access data, and access gained through 3rd-party apps, IoT devices in the home, and portals created for external users like customers, partners, contractors and MSPs makes security a much more complicated and challenging process.

It’s perfectly natural that companies are eager to use these access points to increase the functionality of their cloud and SaaS systems, but going in full bore without understanding how to secure and monitor them in the same way may lead to major access vulnerabilities that are beyond the capacity of the organization to identify and prepare against.

It’s entirely true that unmanaged SaaS usage means sensitive corporate data may make its way out of the house, and do so long before those in charge of security become aware of the extent of the problem and what they might do to minimize the damage done.

When we consider further that SaaS applications often integrate with other SaaS applications the risk is magnified even further.

Responses in Progress

Organizations are making an effort to reduce the risk posed to their data by SaaS apps without stifling speed, creativity and business success, but it’s not an easy fix at this point by any means. Security and IT teams can’t depend exclusively on in-house expertise to have the security measures they need in place in a timely manner. Or at all. With the increasing complexity of cloud and SaaS environments, companies will need to use automated tools to ensure that their security settings are in line with business intent, along with continuous monitoring of security controls to prevent configuration drift.
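
To illustrate what preventing configuration drift can mean in practice, here’s a minimal sketch that compares a declared security baseline against the settings a SaaS app currently reports. All the setting names are hypothetical, and fetch_current_settings() stands in for whatever admin or reporting API the actual vendor exposes.

```python
# Hypothetical security baseline: what the business intends.
BASELINE = {
    "external_sharing_enabled": False,
    "mfa_required": True,
    "public_links_allowed": False,
}

def fetch_current_settings() -> dict:
    # Placeholder: in practice this would call the SaaS vendor's
    # admin API. Hard-coded here so the sketch runs on its own.
    return {
        "external_sharing_enabled": True,  # has drifted from baseline
        "mfa_required": True,
        "public_links_allowed": False,
    }

def detect_drift(baseline: dict, current: dict) -> dict:
    # Report every setting whose live value differs from the baseline.
    return {
        key: (want, current.get(key))
        for key, want in baseline.items()
        if current.get(key) != want
    }

for setting, (want, got) in detect_drift(BASELINE, fetch_current_settings()).items():
    print(f"DRIFT: {setting} should be {want}, found {got}")
```

Run on a schedule, a check like this flags drift shortly after it happens instead of months later during an audit.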

Device Prices Set to Go Up Due to Chip Shortage

Anyone who knows of a quality smartphone that checks all the boxes and comes with an agreeable price tag can speak up and volunteer that information now. A good one that’s not going to be obsolete within a year or two is going to cost you, and given recent global developments in the digital hardware sphere it may be that even the less expensive smartphones, laptops, and desktop computers are going to go up in price quite a bit too. We’re entering an inflationary period in North America right now, but that’s not why prices on devices are shooting up.

It’s mostly related to how international chip makers are hamstrung in their ability to make semiconductor chips in the quantities they have for years. Take a look at any of the very newest smartphones on the market, and your enthusiasm for their features is quickly slowed when you see how much they cost. If you’re a person who’s fine with a more standard and ordinary device this trend isn’t going to bother you too much, but if you’re all about the latest and greatest in technology – be prepared to pay quite a bit more for it.

There’s no getting around the basic principle of supply and demand with pretty much any consumer product in the world. It turns out this applies to components too, and when it comes to what enables these devices to work their magic, demand is now outdistancing supply like never before. Any Canadian web hosting provider like us here at 4GoodHosting has its own operating constraints related to demand outstripping supply too, but it’s different when it’s the individual consumer who’s faced with the prospect of paying a LOT more when it’s time to upgrade or replace.

Wafers Wanted

Wafers aren’t only snacks; in fact they are an integral part of the chips so needed by mobile and computing device manufacturers these days. What’s happening now is that recent increases in wafer quotes by major manufacturers mean there’s going to be a serious impact on the price of actual hardware, including cell phones and a broad range of everyday consumer hardware. It’s believed this is going to result in more consumers opting to buy lower-end hardware.

If you’re not familiar with the role these parts play, modern PCs and smartphones usually contain one or two key chips (CPU, GPU, SoC) made using the most advanced chip tech, like a leading-edge or advanced node. The foundries which make the chips have already increased pricing for their customers. Until recently, most chip designers and other firms that make the finished products were hesitant to pass the price hikes on to their customers, so entry-level and mainstream products were still agreeable to price-sensitive customers.

Now though, the cumulative cost increases for some chips from 2020 to 2022 will be 30% or even more. It’s not possible to avoid passing this increase along the supply chain, as margins are already very thin and these companies will not be okay with losing money. The expectation now is that chip designers will increase the prices they charge OEMs, and that will filter down to the end products in 2022.

Bigger BOM Costs

BOM is an acronym for Bill of Materials, and if vendors are going to pass these higher wafer prices on to OEMs then the estimate is that high-end smartphone BOM cost increases will be around 12% for 2022. The average BOM cost for a high-end smartphone is usually around $600. But what’s interesting is that entry-level phones could have their BOM cost affected even more – they could see a 16% increase.
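
Some quick arithmetic shows what that means per unit. The $600 high-end BOM and the 12% / 16% increases come from the figures above; the $150 entry-level BOM is an assumed number purely for illustration.

```python
def new_bom(cost: float, pct_increase: float) -> float:
    # Apply a percentage increase to a bill-of-materials cost.
    return cost * (1 + pct_increase / 100)

print(new_bom(600, 12))  # high-end: 672.0, roughly $72 more per unit
print(new_bom(150, 16))  # entry-level (assumed $150 BOM): 174.0, $24 more
```

The dollar bump is smaller on the entry-level device, but it’s a much larger share of the final sticker price – which is exactly why mainstream MSRPs are expected to feel it most.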

So an increase of anywhere from 12 – 16% in BOM cost can create a major impact on a device’s recommended price, and experts say these factors will keep pricing high for years to come. Making chips using leading-edge fabrication technologies like TSMC’s N7 and N5 or Samsung Foundry’s 7LPP and 5LPE is very pricey, due to contract chip makers charging up to 3x more for processing wafers using their latest nodes.

Investments in this hardware as part of technology advances are usually made long before those chips start to earn money. It’s for this reason that only a handful of companies in the world can afford leading-edge processes.

It’s also forecast that over the next few years these leading-edge technologies will remain mostly inaccessible for the majority of chip designers, and even rather advanced chips will still be produced on 16nm and 28nm-class nodes, but with 10% to 18% increases in wholesale pricing attached to them.

Demand, and More Demand

The demand seen for all electronic devices is already higher than ever these days, and emerging and powerful trends like 5G, AI, and HPC all mean that the demand for chips will only get bigger. Experts foresee the supply-demand balance not coming around until mid-2023, and adding to all that is the fact that demand for equipment designed for lagging-edge nodes is growing faster than demand for tools aimed at leading-edge nodes – the same nodes that won’t be part of the newer technology chips that major manufacturers are going to be focused on producing.

Adding to this further is that major chip foundries have increased their quotes for 40/45 nm, 55/65 nm, 90 nm, and larger nodes multiple times since mid-2020. This is going to mean that the price of a wafer processed using 90nm technology will increase by 38% in 2022. Again, prices will be passed on to consumers.

The fact that these foundries have utilization rates above 100% nearly all the time means they spend more time processing wafers and less time on maintenance too. They will be even more reluctant to drop prices even when the demand-supply balance stabilizes.

More Will Go for Entry-Level Devices

Price-sensitive customers who buy higher-end smartphones and PCs may instead choose entry-level devices unless different midrange products appear on the market. The GPU market went through something similar not long ago. This happening with more popular devices like iPhones, Pixels, and Galaxies is quite likely.

This is because the price increases on chips made using mature nodes will affect the end costs attached to all devices. For high-end PCs and smartphones these additional costs won’t affect their recommended prices much at all. But for mainstream devices these additional costs may have a drastic effect on MSRP. Many buyers may feel they have to look past even midrange products and consider buying entry-level instead.

Servers – Why Bare Metal May Be Better


Most people don’t know the workings of what goes into their being able to surf the Internet and visit web pages. That’s perfectly fine unless you’re a developer or something similar. When you click on a URL what you’re doing is making a request, and that request is handled by a server. Back in the 1990s when the Internet was in its infancy there were requests being made, but nowhere near the massive numbers of them being made nowadays. This is why servers have been having a lot more asked of them all the time, and sometimes they just don’t have the capacity that’s needed of them.

Need and demand have always been the spurs for innovation, and this is no exception. The aim has been to design servers that have the ability to handle the ever-greater demands on them, and these days the top dog in that regard is a bare metal server. That may sound like a strange name, but they’re called bare metal servers because by being just ‘exposed metal’ they highlight the fully physical aspect of centralized and individual hosting of websites.

That’s because the appeal of bare metal servers is all about ‘single tenancy’ and having the best in performance, reliability and security. It means that your website will be as readily available as possible for visitors at all times, and of course if that site is a key part of your e-commerce business operations then having that performance, reliability, and security is going to be of primary importance for you. Here at 4GoodHosting it should come as no surprise that as a Canadian web hosting provider this is the kind of stuff we are very in the know about, so let’s get further into why bare metal tends to be best when it comes to servers.

  1. Better Relative Costs

The performance of on-premises servers and bare metal servers is fairly similar. The biggest cost savings come with datacenter space for hardware as well as data center power costs. These cost savings can be significant, and both offer varying degrees of quality of service. You will pay more upfront for a bare metal server, and that’s because they’re pretty much exclusive to the client in terms of dedicated resources.

What you get for that is unparalleled performance, hardware configurations, and nearly unlimited scalability. The next advantage is that bare metal server providers often offer per-hour billing plans rather than full service contracts paid in advance for the entirety of the term. Bare metal servers may seem pricier, but this is offset by the advantage of paying only for what you use.
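
Here’s a back-of-the-envelope sketch of that trade-off. Every rate below is a made-up illustrative number, not a quote from any provider.

```python
HOURLY_RATE = 0.90        # $/hour for a bare metal server (assumed)
MONTHLY_CONTRACT = 550.0  # $/month prepaid term contract (assumed)

hours_used = 400  # e.g. a workload that only runs part of the month
pay_per_use = HOURLY_RATE * hours_used

print(f"Per-hour billing: ${pay_per_use:.2f}")      # $360.00
print(f"Prepaid contract: ${MONTHLY_CONTRACT:.2f}")  # $550.00
# With partial utilization, paying only for what you use comes out ahead.
```

Whether the math favors per-hour billing obviously depends on utilization; a server that runs flat-out all month can easily flip the comparison.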

  2. More Features for Business

Bare metal servers can be utilized by and provide advantages for any business. That’s primarily because of the ability to configure them exactly how you want them to be before deploying them. That can be done in a range of hardware configurations plus plenty of virtualization environments, and bare metal servers let businesses custom build their own servers with impressive flexibility and customization options. A business that is building bare metal servers can create new operating systems, virtual operating systems, or convert an existing operating system to a virtual environment.

Eliminating hardware redundancy is the next part of the appeal. One example is how a bare metal server can be used to ensure that the server’s power is never turned off, which could mean less in the way of server downtime.

  3. More Agreeable Cost Factors

The biggest determining factor for most when considering a bare metal server will be the cost of top-end hardware and features. You’ll be evaluating which equipment you need, what features you require, and how much redundancy might still be needed with the hardware. The more you pay for your server, the more you’ll have in available cores, powerful hardware configurations, and available RAM.

One thing to factor into cost savings is the reliability of bare metal servers when it comes to downtime. This reliability comes from the fact that by not sharing the server with other renters, you’re not going to experience the kind of downtime that can cost a company when virtualized servers are the choice.

  4. Software Configurations are Usually Less Pricey

Software configuration is another important part of the equation. A bare metal server configured with dedicated hardware components and high-end graphics is going to be quite the powerhouse unit. If that can be acquired not-so expensively, that’s going to be a huge plus. Businesses will be considering how much they want to dedicate their resources to the maintenance and support of this server. Some companies are really interested in expanding their virtualization and virtual computing space, and a good virtualization platform can make that a simpler process.

  5. Bare Metal Server Management and Overhead Costs

The management and general overhead costs for bare metal servers are much the same as for virtualized servers. This usually isn’t a dissuading factor for decision makers, depending on what the organization wants from its servers.

We can consider how a bare metal server can have faster response times to the network than a virtualized server, something that will definitely be advantageous. The networking team can set up new server builds quickly and configure them with the features they need in a short amount of time, because a bare metal server can be deployed without making any changes to the virtualization platform.

Ways to Minimize Data Usage with Android OS Devices

There are a lot of people who are decidedly in one camp or the other when it comes to iOS or Android smartphones, but it’s probably fair to say that many people go with Android ones because they’re not that savvy about the technology and they like the fact that Android phones are generally less expensive than iPhones. That may change some day, but as of now that’s the way it is, and given Apple’s predispositions when it comes to their devices it’s unlikely it will. So if the small green droid is your guy then just go with your inclination.

You’ll be hard pressed to find more than a few people who have less than a couple of gigs of data on their mobile plans these days, and many people will have 10+ as a minimum if they’re inclined to do more than check emails and use the odd messenger app from time to time. But if you’re an Android user and you’re roaring through your monthly data allowance then you might be interested in what we’re going to share here this week – ways to use less data on your Android device.

It’s something that will be of interest to some and of no interest to others, but here at 4GoodHosting we’re like every other good Canadian web hosting provider in that we know most people choosing us for web hosting in Canada are being carried by the increasing wave of digitalization that seemingly all of us are these days. That means more connectivity, and a need for more data. Or being smarter with the existing data you have, as the case may be.

So let’s get right into it and put you in the know about making your data allowance last longer, even if you paid way less for your Android OS device in the first place. After all, who is made of money these days?

Diagnose Data Usage

The first step is to get an understanding of your data usage each month, and where and how you’re utilizing your data. On older Android versions you can open up the Network and Internet section and tap the line labeled ‘data usage’ before selecting ‘mobile data usage’ on the screen that appears. On newer devices you follow the same path but go a little bit further, into ‘app data usage’. If yours is fully updated to Android 12 then you’ll see a gear-shaped icon beside your mobile carrier’s name that you tap instead.

From there, just have a look and if you’re going through much more data than you used to then you can get a definitive look at where most of it is going. Is any of that superfluous stuff that you don’t really need to be doing? This is where you might want to start making some priority changes if you’re not willing or able to add more data to your plan.

Do Away with Background ‘Trickles’

Unnecessary background app activity has always been a cause of diminishing data limits. Social and news apps tend to be the worst in this regard, checking in at regular intervals to prep content delivery in case you choose to open them again. If you don’t, that is data wasted. So here’s what you should do – check these apps and look for data-saving options in their settings. One popular choice for certain apps like the Twitter app for Android is to uncheck the ‘sync data’ option, which you should be able to find quite easily.

And here’s a general rule that everybody should take note of: no matter what you do to change user preferences or anything else of the like, Facebook is an obscene data guzzler and you should really try to limit your time on it with mobile if you’re concerned about using too much data. Save the sifting through posts and the like for when you’re at home and on your Wi-Fi network.

Compress Your Mobile Web Experience

A quick, easy and guaranteed effective way to force browsers to be not so data hungry is to reorient Google’s Chrome browser for Android into its Lite Mode, where pages are routed through Google’s servers and compressed before they are presented to you. Here’s how:

  • Go into Chrome settings and look for the line listed as ‘Lite Mode’
  • Tap to activate it, and leave it that way all the time if you’re so inclined

It really is that simple, and estimates are that steady and consistent use of Lite Mode can add up to data savings of as much as 60%. The other benefit is that you end up browsing much faster too once your mobile web experience is compressed.

Advance Downloads of Media

This one might seem very obvious, but mobile streaming will absolutely ruin your data budgeting if you engage in it too often. It is hugely advantageous to download content in advance, and there are plenty of multimedia apps that make that fairly easy. Those who have the user freedoms that come with YouTube Premium or YouTube Music Premium can be proactive here by going to the ‘background and downloads’ section of the app to adjust the settings. By tapping the 3-line menu icon you can find the download button, letting you conduct your downloads while still at home on your Wi-Fi network and not use even an ounce of mobile data.

Put the Brakes on Play Store

Auto updates can dig into your data too, and the Google Play Store is one of the worst culprits here. Open up the app on your device and select Settings and then Network Preferences. Once you’re there you can choose to have auto updates limited to ‘over Wi-Fi only’, and you can choose the same for auto-play videos. Highly recommended, and easily done. If recently updated menu choices with apps, games and the like are important to you, then you can choose ‘ask me every time’ to prevent ongoing auto updates here.

Go with Light Versions

Many services now offer scaled-down versions of apps and sites that you can use or visit without using so much data. Look for Google’s ‘Go’ branded apps here, including Google Go, Google Maps Go, Navigation for Google Maps Go, Gallery Go, Gmail Go, Google Assistant Go, and YouTube Go. All come ready made and will allow a sufficient user experience while not going too hard and heavy on your data allowance.

$1.44 Billion Ready to Go a Long Way for Satellite Internet in Canada

If digital connectivity isn’t an integral part of your life these days then you haven’t been living on earth, or at least anywhere outside of North Korea. What’s kind of ironic is the fact that those folks are kept off the information superhighway entirely for the most part, while their cousins and next door neighbours in South Korea have the best Internet on earth. The logistical challenges that come with a country as large as Canada make those types of internet networks and speeds comparatively impossible, but there’s recently been a major development at the federal level that promises to make quality internet available to more Canadians than ever before.

Here at 4GoodHosting we imagine we’re like most Canadian web hosting providers in that the recent news that the federal government is directing 1.4 billion-plus dollars to Telesat satellite internet is something we NEED to talk about. The very basis of the service we provide depends on the web functioning as it should, but it’s been a well-known fact for years that people in urban centres enjoy much better connectivity than those in more rural areas of the country. Considering that the web is for so much more than browsing or streaming and is increasingly key to participating in life and personal advancement, that’s not right.

Telesat is a Canadian satellite communication company that began life as a Crown corporation, and what is in development is their Lightspeed low-earth-orbit (LEO) satellite constellation. The investment is in the development of the first and second generations of satellites, plus establishing domestic employment targets and – in the bigger picture – aiming to reduce costs for regional providers who rely on Telesat products to provide internet connectivity to customers.

This has the potential to be a MAJOR development for people in quieter parts of the country who have been disappointed in the inferior internet they’ve had for more than 20 years now. Previously it was a problem without a real practical solution, but satellite internet is poised to be that solution now.

298-Strong ‘Constellation’

Telesat’s Lightspeed constellation is going to be made up of 298 low earth orbit (LEO) satellites equipped to deliver gigabit speeds with impressive 30 to 50-millisecond latency, and make that available anywhere in the world. The network will run on global priority Ka-band spectrum and boast a capacity of 15 Terabits per second. That would have sounded totally unrealistic even 10 years ago, but here we are – isn’t progress a wonderful thing?

In our country the Lightspeed constellation will finally deliver the connectivity the country’s most isolated areas have been looking for over a long, long time now. Internet and cellular services enabled through Telesat will begin service about 4 years from now and connect 40,000 households in underserved communities. This is right in line with the government’s previously stated goal of providing high-speed internet to all Canadians by the end of this decade.

Jobs Too

Telesat is also going to invest $3.6 billion in capital expenditure in Canada, and this project’s development and then long-term infrastructure maintenance should provide up to 700 jobs. There is also going to be a focus placed on women in STEM programs when it comes to filling the engineering-related positions.

Partner to Starlink?

SpaceX’s Starlink probably needs no introduction, as this game-changer has already been discussed in the news at great length. Starlink is already making its way into Canadian homes, although with limited availability at this point. Starlink launched its testing phase in Canada earlier in 2021, allowing eligible Canadian customers to register for a satellite internet subscription. If anyone’s tried it and would care to let us know how it’s been for them, we’d love to hear it.

One big difference between Starlink and Telesat’s Lightspeed will be that Telesat will be making their powerhouse available to regional internet service providers. That is quite different from Starlink, which will sell its service directly to consumers.

Telesat has also received funding from provincial governments individually: Ontario to the tune of $109 million, and Quebec $200 million, plus a separate $200 million investment in the company by La Belle Province.

Could it be the appetite for genuinely high-speed and reliable internet is stronger in rural Quebec than elsewhere in the country? Who knows, but this is definitely a potentially major development in making it available to all Canadians, no matter where they live.

Avoiding VPNs with Tracker Apps

The appeal of VPNs won’t need much explanation, and as data storage and access needs continue to grow all the time, we will see more and more organizations making the move to them. There have been plenty of times the masses have been told that some new wrinkle in their digital product is innocuous or harmless, but at least some of the time that’s just not the truth. Many VPNs employ tracker apps, and the reasoning given for them is that they are part of offering you a more personalized experience and the like.

That in itself is true in the bigger picture, but in the smaller one for some, it may be that tracker apps are actually putting you at risk. Here at 4GoodHosting we’re a Canadian web hosting provider that takes the well-being of our customers to heart, and given that some of you may be the ones making IT decisions for your business, venture, or interest, this is a topic that’s worthy of discussion. The industry consensus very much seems to be that tolerating VPNs with tracker apps is something you shouldn’t be so indifferent to.

But of course, convincing is always required and so with that understood let’s get right to laying out why that’s the consensus, and why a less intrusive VPN may be best for you.

Important Protocols

Most of the people making those IT decisions will be at least somewhat familiar with the protocols and encryption methods used by VPNs. Usually that level of understanding doesn’t make it at all clear why VPNs with trackers create risks. But let’s start at the start and begin with what a tracker is and what a tracker does. It’s actually a fairly basic explanation – a tracker is something that tracks the choices you make when moving around the Internet. Most websites and apps use trackers in some way, and they’re inclined to follow you nearly everywhere you go.

The information gathered by the trackers about you is then used for targeted advertisements and the like. The trackers are built by software developers at the behest of businesses that want to create greater profits by increasing the chances that like-minded people are made aware of what they have to offer.

1st and 3rd Party Trackers

Understanding the difference between first- and third-party trackers is also helpful in having a better understanding of them, and why they are not as completely harmless as some people might think. The distinction between them is important. The ‘cookies’ we’ve ALL heard of are examples of first-party trackers, used to remember things like your language, layout preferences, and even the contents of your shopping cart.

It’s fair to say that cookies are by and large necessary for many websites to give you the type of visitor experience you’re expecting and refusing cookies from being stored is fairly straightforward if you have concerns about their function.

Third-party trackers are entirely different. They’ve been built into websites and apps for the explicit purpose of making money from you. What they are after nearly all of the time is PII – personally identifiable information. Examples could be your IP address, what browser you are using, where you choose to click, how long you are on a certain web page, device specs and more. As you’d imagine, this is where most people start to think these trackers are overstepping their bounds.
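
To show how little it takes to turn attributes like those into a stable identifier, here’s a minimal sketch of attribute-based fingerprinting; the visitor values are made up for illustration.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    # Serialize the attributes in a fixed order and hash them. The result
    # is an ID that stays the same on every visit, no cookie required.
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "America/Vancouver",
    "language": "en-CA",
}
print(fingerprint(visitor))  # same attributes in -> same ID out
```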

Free to Profile

And that will also be because the information collected by 3rd-party trackers is used to create a profile of you, and from it come targeted ads delivered to gain business and revenue from you. And yes, Google is ALL about 3rd-party trackers, with a ‘more the merrier’ attitude toward having them in place.

A lot of mobile apps will also make use of 3rd-party trackers, and in some ways you need to be even more aware when it comes to using a VPN that implements trackers in its apps. VPN apps that utilize trackers are compromising your privacy to make money, and that’s really the long and short of it. The trackers are not required for the app to function properly, and they are actively leaking your information to Google, Facebook, or whoever else among big data companies.

The extent of the information being collected will vary from app to app. But the use of trackers regardless means information about you is being shared, and this isn’t being communicated to users whatsoever.

More Capable Than You Think

Plenty of these third-party trackers are sophisticated to the point that they have a wide net of data to pull from, and often your IP address isn’t even needed to create a targeted profile of you. These trackers can use the huge amount of information they have, along with a unique ID for you, to connect the dots and still trace everything back to you. It’s good to know that even if something as easily traceable as an IP address isn’t being shared, there may still be the ability to connect dots and track a person’s behavior online.

This is why ever greater numbers of decision makers are deciding that a VPN service that is making use of trackers should not be trusted.

We’ll conclude here today by saying that it is possible in some instances to get clarity on what a VPN’s tracker might be getting up to. A good example is the Exodus tool, which is very useful for Android-specific information. Plus Apple is putting in place brand-new guidelines for App Store apps, making it mandatory that every single app disclose the information it collects, the permissions it needs, and what trackers are being used (if any). These are definitely steps in the right direction if people are in general going to become more trusting of these trackers.

7 Means for a Cooler PC

It’s best to be cool, and we’re not talking about ‘being’ anything at all other than not running too hot. Anyone who’s ever had no choice but to put a computer through its paces will know what it’s like to hear the cooling fan whirring furiously and failing to make much difference with a PC that’s running too hot, but there are things you can do to keep yours cool that have nothing to do with shades and a leather jacket.

CPUs aren’t the only components that get run a little ragged and start feeling the heat. There are a whole lot of people who are harder on their GPUs than anything else, and if you’re an avid gamer you also probably don’t need to be told what it’s like to have a fan that sounds like it’s about to seize up and die if it’s made to go any harder.

Here at 4GoodHosting we’re like most good Canadian web hosting providers in that there are but a few of us here who are real hardware experts. Most are more savvy on the soft side, but even if you’re not the type who could do open-heart surgery on a PC you can still make yours less likely to overheat and perhaps shut down at the worst time possible.

That’s what we’ll share with our entry this week – 7 ways to keep your work or gaming station from overheating.

Clean out your PC

When we talk about cleaning here, we’re talking about it literally and not meaning getting rid of superfluous files and scripts and the like. Through those vents goes a lot of dust and dirt that ends up settling on the interior components of your PC. That can have negative effects depending on where the bulk of that crud builds up. Pet fur can be a part of it too.

So actually opening up the casing and using a fine brush and / or some isopropyl alcohol to clean up problem spots can be a good idea.

Make Sure Fans are Properly Installed

It doesn’t happen often, but sometimes cooling fans are facing the wrong way, and when this happens they are 100% doing more harm than good. A fan’s orientation should have the intake side taking in cool air and the exhaust side of the blades dispersing hot air from the unit.

Have a look at yours if you’re having a problem with overheating. If the blades that are facing you curve away from you, then they’re the way they should be. If the blades facing you curve towards you, then somebody messed up assembling or re-assembling the computer and you’ve got a very simple but promising repair job to do.

You should also confirm the configuration works well for airflow. Try to aim for a slightly positive pressure setup, and it’s something you can find out more about online with a simple Google search.

Repaste the CPU Cooler

Having old thermal paste around your CPU cooler is pretty common, especially if yours is an older model and you’ve never had any reason or inclination to open it up and do basic maintenance. But if your computer is overheating now then maybe it’s time to give this a try, and it’s also not particularly difficult.

Redoing thermal paste can improve a CPU’s temperatures, and repasting can also fix a wonky application on a brand-new build too. All you need to do is buy a tube of Arctic Silver 5 or something similar, carefully scrape away the remnants of the existing paste, and apply a new coating. It’s possible to repaste a GPU as well, but it’s more challenging than doing it on a CPU.

Add Additional Fans

If one fan is not getting it done for you, you can opt to add more fans to cool your CPU or GPU down more effectively. A good choice is to start with additional case fans, which tend to be the cheapest, and usually not too much is required to work them into existing cases. Many people choose to have two fans at the front.

Upgrade the CPU Cooler

Generally speaking, beefier 3rd-party model fans are going to perform better than the stock ones that came with your PC. Dropping your processor’s temperature is made easy by upgrading to a third-party CPU cooler much of the time.

Another option is to go with a closed-loop cooler, but that’s only really necessary when RAM clearance is an issue for you or aesthetics are a consideration.

Go with a Mesh Front Case

Mesh-front cases are really popular right now, and switching yours out and going with one of these is also fairly basic. They look different and they work much better for ventilating against heat buildup. A cooling fan upgrade and a mesh front case may make a major difference in your ability to stop your desktop from overheating.

The last suggestion we’ll make is a minor one so it won’t get its own subhead. Most computers are perfectly fine with going dormant or to sleep when they’re not in use rather than being shut down. But shutting down at least somewhat regularly is better for general prevention of CPU and GPU heating. It’s a good idea to let yours go off completely every once in a while.