Easy Cloud Access May Increase Data Security Risk

It’s been said many times that you can’t stop progress, and these days that may be one of the more applicable maxims around. That’s especially true when it comes to technology, as no degree of stepping backwards is going to be tolerated if advances mean real benefits. Acronyms are a challenge for many, but even if you have the slightest amount of digital savvy you’ll know that SaaS stands for Software as a Service, and it’s one of many examples where cloud computing technology has made the hassles of hardware installation a thing of the past.

Here at 4GoodHosting we’ve seen firsthand benefits from the Cloud, and the way it’s removed the need for so much physical hardware and infrastructure is something any Canadian web hosting provider will be able to relate to. As a collective user base we’re certainly not going to approve of any regression here either, but more and more we’re learning that there are security risks related to cloud infrastructure. That’s not news, and the fact that ease of access increases that risk probably doesn’t come as a surprise either.

But that’s the truth of the situation, and it’s something worth looking into, especially as businesses are flocking to software-as-a-service applications with the aim of improving the efficiency of their operations and overall employee productivity. The question is though – is weak control of access to cloud apps putting those organizations’ data at risk?

1.5x Exposure on Average

There was a recent study showing that the average 1,000-person company using certain SaaS apps is likely exposing data to anywhere from 1,000 to 15,000 external collaborators. Similar estimates from it suggested that hundreds of outside companies, if not more, would also have access to a company’s data, and that around 20% of a typical business’s SaaS files might be available for internal sharing with little more than the click of a link.

What can be taken away from that is that unmanageable SaaS data access is a legit problem that can apply to businesses of any size these days.

Last year, slightly more than 40% of data breaches occurred as the result of web application vulnerabilities according to this report. Nearly half of all data breaches can be attributed to SaaS applications, and seeing as how more and more businesses rely on this software, it is legitimately a huge threat. Especially when you consider that many companies store anywhere from 500k to a million assets in SaaS applications.

This looks to be even more of a problem in the future. The incorporation of SaaS services is predicted to grow, with revenues expected to jump a full 30% over the next 3+ years to 2025.

COVID Factor

This growth has been, and will continue to be, accelerated by the new working realities the COVID pandemic has created for us. That’s because SaaS applications are easy to set up and don’t require the same outlay of time and resources from an IT department. The way businesses can identify problems and procure solutions on their own, within a timeframe that works for them, is a huge plus.

Add to that the shift to remote work for so many people, and the ability to access a SaaS from anywhere and on any device is something that is going to keep pushing the appeal of Software as a Service for a long time yet to come. And in the bigger picture that is definitely a good thing.

This goes along with massive increases in the adoption of cloud services, choices made for all the same reasons and a similar part of the new digital workplace reality for a lot of people. Many organizations that already had this shift in mind had their timetable accelerated by the pandemic and the new need to have team members working remotely.

Software Visibility Gap

In the early 2000s, free and small-scale SaaS offerings were still something of an unknown, but at the most basic level they were very agreeable because they met needs well and offered more speed and agility than conventional options. They often genuinely improved business results, and that’s why they took off from there.

But since then the meteoric growth in adoption has introduced problems, and in many ways they were ones that industry experts foresaw – even back then. Unmanaged assets will always pose some degree of risk, and by making it so that ease of access is expected from the user base they’ve also created the possibility of greater data insecurity.

This is what creates a software visibility gap: the cloud obscures the inner workings of the applications and the data stored in them, blurring insight into potential attacks to the point that security measures can’t be validated for effectiveness the same way.

Problems with Data Everywhere

Cloud and SaaS platforms as they exist today mean the corporate network is no longer the only way to access data. Access gained through 3rd-party apps, IoT devices in the home, and portals created for external users like customers, partners, contractors, and MSPs makes security a much more complicated and challenging process.

It’s perfectly natural that companies are eager to use these access points to increase the functionality of their cloud and SaaS systems, but going in full bore without understanding how to secure and monitor them in the same way may lead to major access vulnerabilities that are beyond the organization’s capacity to identify and prepare against.

It’s entirely true that unmanaged SaaS usage means sensitive corporate data may make its way out of the house, and do so long before those in charge of security become aware of the extent of the problem and what they might do to minimize the damage done.

When we consider further that SaaS applications often integrate with other SaaS applications the risk is magnified even further.

Responses in Progress

Organizations are making an effort to reduce the risk posed to their data by SaaS apps without stifling speed, creativity, and business success, but it’s not an easy fix at this point by any means. Security and IT teams can’t depend exclusively on in-house expertise to have the security measures they need in place in a timely manner. Or at all. With the increasing complexity of cloud and SaaS environments, companies will need to use automated tools to ensure that their security settings are in line with business intent, along with continuous monitoring of security controls to prevent configuration drift.

Device Prices Set to Go Up Due to Chip Shortage

Anyone who knows of a quality smartphone that checks all the boxes and comes with an agreeable price tag can speak up and volunteer that information now. A good one that’s not going to be obsolete within a year or two is going to cost you, and given recent developments in the digital hardware sphere, it may be that even the less expensive smartphones, laptops, and desktop computers are going to go up in price quite a bit too. We’re entering an inflationary period in North America right now, but that’s not why prices on devices are shooting up.

It’s mostly related to how international chip makers are hamstrung in their ability to make semiconductor chips in the quantities they have for years. Take a look at any of the very newest smartphones on the market and your enthusiasm for their features is quickly tempered when you see how much they cost. If you’re a person who’s fine with a more standard and ordinary device this trend isn’t going to bother you too much, but if you’re all about the latest and greatest in technology – be prepared to pay quite a bit more for it.

There’s no getting around the basic principle of supply and demand with pretty much any consumer product in the world. It turns out this applies to components too, and when it comes to what enables these devices to work their magic, demand is now outdistancing supply like never before. Any Canadian web hosting provider like us here at 4GoodHosting has its own operating constraints related to demand outstripping supply too, but it’s different when it’s the individual consumer who’s faced with the prospect of paying a LOT more when it’s time to upgrade or replace.

Wafers Wanted

Wafers aren’t only snacks; in fact they are an integral part of the chips that mobile and computing device manufacturers need so badly these days. What’s happening now is that recent increases in wafer quotes by major manufacturers are going to have a serious impact on the price of actual hardware, including cell phones and a broad range of everyday consumer hardware. It’s believed this will result in more consumers having to buy lower-end hardware.

If you’re not familiar with the role these parts play, modern PCs and smartphones usually contain one or two key chips (CPU, GPU, SoC) made using the most advanced chip tech, like a leading-edge or advanced node. The foundries which make the chips have already increased pricing for their customers. Until recently most chip designers and other firms that make the finished products were hesitant to pass the price hikes on to their customers so entry-level and mainstream products were still agreeable to price-sensitive customers.

Now though, the cumulative cost increases for some chips from 2020 to 2022 will be 30% or even more. Passing this increase along the supply chain can’t be avoided, as margins are already very thin and these companies will not be okay with losing money. The expectation now is that chip designers will increase the prices they charge OEMs, and that will filter down to the end products in 2022.

Bigger BOM Costs

BOM is an acronym for Bill of Materials, and if vendors are going to pass these higher wafer prices on to OEMs, then the estimate is that high-end smartphone BOM cost increases will be around 12% for 2022. The average BOM cost for a high-end smartphone is usually around $600. But what’s interesting is that entry-level phones could have their BOM cost affected even more – they could see a 16% increase.

So a 12–16% increase in BOM cost can have a major impact on a device’s recommended price, and experts say these factors will keep pricing high for years to come. Making chips using leading-edge fabrication technologies like TSMC’s N7 and N5 or Samsung Foundry’s 7LPP and 5LPE is very pricey, as contract chip makers charge up to 3x more for processing wafers on their latest nodes.
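To put those percentages in dollar terms, here’s a small sketch. The 12% and 16% figures and the $600 high-end BOM come from the estimates above; the $200 entry-level BOM and the pass-through assumption are ours, for illustration only.

```python
# Rough illustration of how a BOM increase could flow into retail pricing.
# The pass-through multiplier assumes the full increase reaches the buyer,
# which is an assumption, not a claim about any specific vendor.

def retail_impact(bom_cost, bom_increase_pct, pass_through=1.0):
    """Extra cost to the buyer if the BOM increase is fully passed along."""
    return bom_cost * bom_increase_pct * pass_through

# High-end phone: ~$600 BOM, ~12% increase
high_end = retail_impact(600, 0.12)   # roughly $72 of added cost
# Entry-level phone: $200 BOM (hypothetical figure), ~16% increase
entry = retail_impact(200, 0.16)      # roughly $32 of added cost

print(f"High-end added cost: ${high_end:.0f}")
print(f"Entry-level added cost: ${entry:.0f}")
```

The takeaway: $32 may be less than $72 in absolute terms, but as a share of a cheap phone’s sticker price it stings considerably more, which is exactly why entry-level devices are expected to feel this hardest.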

Investments in this hardware as part of technology advances are usually made long before those chips start to earn money. It’s for this reason that only a handful of companies in the world can afford leading-edge processes.

It’s also forecast that over the next few years these leading-edge technologies will remain mostly inaccessible to the majority of chip designers, and even rather advanced chips will still be produced on 16nm and 28nm-class nodes – but with 10% to 18% increases in wholesale pricing attached to them.

Demand, and More Demand

The demand for all electronic devices is already higher than ever these days, and emerging and powerful trends like 5G, AI, and HPC all mean that the demand for chips will only get bigger. Experts foresee the supply balance not coming around until mid-2023, and adding to all that is the fact that demand for equipment designed for lagging-edge nodes is growing faster than demand for tools aimed at leading-edge nodes – the same nodes that won’t be part of the newer technology chips that major manufacturers are going to be focused on producing.

Adding to this further is that major chip foundries have increased their quotes for 40/45 nm, 55/65 nm, 90 nm, and larger nodes multiple times since mid-2020. This is going to mean that the price of a wafer processed using 90nm technology will increase by 38% in 2022. Again, prices will be passed on to consumers.

The fact that these foundries have utilization rates above 100% nearly all the time means they spend more time processing wafers and less time on maintenance too. They will be even more reluctant to drop prices even when the demand-supply balance stabilizes.

More Will Go for Entry-Level Devices

Price-sensitive customers who buy higher-end smartphones and PCs may instead choose entry-level devices unless different midrange products appear on the market. The GPU market went through something similar not long ago. This happening with more popular devices like iPhones, Pixels, and Galaxies is quite likely.

This is because the price increases on chips made using mature nodes will affect the end costs attached to all devices. For high-end PCs and smartphones these additional costs won’t affect their recommended prices much at all. But for mainstream devices these additional costs may have a drastic effect on MSRP. Many buyers may feel they have to look past even midrange products and consider buying entry-level instead.

Servers – Why Bare Metal May Be Better


Most people don’t know what goes into their being able to surf the Internet and visit web pages. That’s perfectly fine unless you’re a developer or something similar. When you click on a URL what you’re doing is making a request, and that request is handled by a server. Back in the 1990s when the Internet was in its infancy there were requests being made, but nowhere near the massive numbers being made nowadays. This is why servers have had more and more asked of them all the time, and sometimes they just don’t have the capacity that’s needed of them.

Need and demand have always been the spurs for innovation, and this is no exception. The aim has been to design servers that have the ability to handle the ever-greater demands on them, and these days the top dog in that regard is a bare metal server. That may sound like a strange name, but they’re called bare metal servers because by being just ‘exposed metal’ they highlight the fully physical aspect of centralized and individual hosting of websites.

That’s because the appeal of bare metal servers is all about ‘single tenancy’ and having the best in performance, reliability and security. It means that your website will be as readily available as possible for visitors at all times, and of course if that site is a key part of your e-commerce business operations then having that performance, reliability, and security is going to be of primary importance for you. Here at 4GoodHosting it should come as no surprise that as a Canadian web hosting provider this is the kind of stuff we are very in the know about, so let’s get further into why bare metal tends to be best when it comes to servers.

  1. Better Relative Costs

The performance of on-premises servers and bare metal servers is fairly similar. The biggest cost savings come with datacenter space for hardware as well as data center power costs. These cost savings can be significant, and both offer varying degrees of quality of service. You will pay more upfront for a bare metal server, and that’s because they’re pretty much exclusive to the client in terms of dedicated resources.

What you get for that is unparalleled performance, hardware configurations, and nearly unlimited scalability. The next advantage is that bare metal server providers often offer per-hour billing plans rather than full service contracts paid in advance for the entirety of the term. Bare metal servers may seem pricier, but this is offset by the advantage of paying only for what you use.
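Here’s a quick sketch of why per-hour billing can offset a higher headline price. Both rates below are made-up illustration values, not quotes from 4GoodHosting or any other provider.

```python
# Hypothetical comparison of per-hour bare metal billing vs. a flat
# monthly contract. All dollar figures are assumptions for illustration.

HOURLY_RATE = 0.50       # $/hour for a bare metal server (assumed)
MONTHLY_CONTRACT = 300   # $/month flat-rate alternative (assumed)

def hourly_cost(hours_used):
    """Total monthly cost under per-hour billing."""
    return hours_used * HOURLY_RATE

# A workload that only runs ~8 hours a day, 22 business days a month
part_time = hourly_cost(8 * 22)   # $88 -- well under the flat contract
# An always-on workload (~730 hours in a month)
always_on = hourly_cost(730)      # $365 -- here the contract wins

print(f"Part-time workload: ${part_time:.0f} vs ${MONTHLY_CONTRACT} flat")
print(f"Always-on workload: ${always_on:.0f} vs ${MONTHLY_CONTRACT} flat")
```

The point isn’t the specific numbers, it’s the shape of the trade-off: pay-for-what-you-use billing rewards intermittent workloads, while continuously loaded servers may still do better on a term contract.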

  2. More Features for Business

Bare metal servers can be utilized by, and provide advantages for, any business. That’s primarily because of the ability to configure them exactly how you want them before deployment. That can be done in a range of hardware configurations plus plenty of virtualization environments, but bare metal servers let businesses custom build their own servers with impressive flexibility and customization options. A business building on bare metal servers can create new operating systems, virtual operating systems, or convert an existing operating system to a virtual environment.

Eliminating hardware redundancy is the next part of the appeal. One example being how a bare metal server can be used to ensure that the server’s power is never turned off, which could mean less in the way of server downtime.

  3. More Agreeable Cost Factors

The biggest determining factor for most when considering a bare metal server will be the cost of top-end hardware and features. You’ll be evaluating which equipment you need, what features you require, and how much redundancy might still be needed with the hardware. The more you pay for your server, the more you’ll have in available cores, powerful hardware configurations, and available RAM.

One thing to factor into cost savings is the reliability of bare metal servers when it comes to downtime. This reliability comes from the fact that by not sharing the server with other renters, you’re not going to experience the kind of downtime that can cost a company when virtualized servers are the choice.

  4. Software Configurations are Usually Less Pricey

Software configuration is another important part of the equation. A bare metal server configured with dedicated hardware components and high-end graphics is going to be quite the powerhouse unit. If that can be acquired not-so expensively, that’s a huge plus. Businesses will also be considering how much they want to dedicate their resources to the maintenance and support of this server. Some companies are really interested in expanding their virtualization and virtual computing space, and a good virtualization platform can make that a simpler process.

  5. Bare Metal Server Management and Overhead Costs

The management and general overhead costs for bare metal servers are much the same as for virtualized servers. This usually isn’t a dissuading factor for decision makers, depending on what the organization wants from its servers.

Consider too that a bare metal server can have faster response times on the network than a virtualized server, something that will definitely be advantageous. The networking team can set up new server builds quickly and configure the server with the features they need in the shortest amount of time, because a bare metal server can be set up without making any changes to the virtualization platform.

Ways to Minimize Data Usage with Android OS Devices

There are a lot of people who are decidedly in one camp or the other when it comes to iOS or Android smartphones, but it’s probably fair to say that many people go with Android ones because they’re not that savvy about the technology and they like the fact that Android phones are generally less expensive than iPhones. That may change some day, but as of now that’s the way it is, and given Apple’s predispositions when it comes to their devices, it’s unlikely it will. So if the small green droid is your guy, just go with your inclination.

You’ll be hard pressed to find more than a few people who have less than a couple of gigs of data on their mobile plans these days, and many people will have 10+ as a minimum if they’re inclined to do more than check email and use the odd messenger app from time to time. But if you’re an Android user and you’re roaring through your monthly data allowance, then you might be interested in what we’re going to share here this week – ways to use less data on your Android device.

It’s something that will be of interest to some and of no interest to others, but here at 4GoodHosting we’re like every other good Canadian web hosting provider in that we know most people choosing us for web hosting in Canada will be carried along by the increasing wave of digitalization that seemingly all of us are these days. That means more connectivity, and a need for more data. Or being smarter with the existing data you have, as the case may be.

So let’s get right into it and put you in the know about making your data allowance last longer, even if you paid way less for your Android OS device in the first place. After all, who is made of money these days?

Diagnose Data Usage

The first step is to have an understanding of your data usage each month, and where and how you’re utilizing your data. On older Android versions you can open up the Network and Internet section and tap a line labeled ‘data usage’ before selecting ‘mobile data usage’ on the appearing screen. For newer devices you follow the same path but go a little bit further into ‘app data usage’. If it’s fully updated to Android 12 then you’ll have a gear-shaped icon with your mobile carrier’s name that you click on.

From there, just have a look and if you’re going through much more data than you used to then you can get a definitive look at where most of it is going. Is any of that superfluous stuff that you don’t really need to be doing? This is where you might want to start making some priority changes if you’re not willing or able to add more data to your plan.

Do Away with Background ‘Trickles’

Unnecessary background app activity has always been a cause of diminished data limits. Social and news apps tend to be the worst in this regard, checking in at regular intervals to prep content delivery in case you choose to open them again. If you don’t, that’s data wasted. So here’s what you should do – check these apps and look for data-saving options in their settings. One popular choice for certain apps, like the Twitter app for Android, is to uncheck the ‘sync data’ option that you should be able to find quite easily.

And here’s a general rule that everybody should take note of; no matter what you do to change user preferences or anything else of the like, Facebook is an obscene data guzzler and you should really try to limit your time on it with mobile if you’re concerned about using too much data. Save the sifting through posts and the like for when you’re at home and on your Wi-Fi network.

Compress Your Mobile Web Experience

A quick, easy and guaranteed effective way to force browsers to be not so data hungry is to reorient Google’s Chrome browser for Android into its Lite Mode, where pages are routed through Google’s servers so that the pages are compressed before they are presented to you. Here’s how:

  • Go into Chrome settings and look for the line listed as ‘Lite Mode’
  • Tap to activate it, and leave it that way all the time if you’re so inclined

It really is that simple, and estimates are that steady and consistent use of Lite Mode can deliver data savings of up to 60%, and the other benefit is that you end up browsing much faster too once you’ve compressed your mobile web experience.
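If you like to see what that 60% figure could mean for you, here’s a back-of-the-envelope sketch. The savings rate is the estimate quoted above; the 2 GB of monthly browser traffic is an assumed example figure.

```python
# Illustrative monthly data savings from compressed browsing, assuming
# the quoted ~60% savings rate applies to all of your browser traffic
# (a simplification -- real savings vary by page and content type).

def data_after_compression(browser_mb, savings_rate=0.60):
    """Data actually consumed after compression, in MB."""
    return browser_mb * (1 - savings_rate)

# Example: someone who browses through 2 GB (2000 MB) per month
remaining = data_after_compression(2000)
print(f"Roughly {remaining:.0f} MB used instead of 2000 MB")
```

Even if the real-world rate lands well below 60%, shaving a meaningful slice off the single most constant data drain on a phone adds up over a billing cycle.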

Advance Downloads of Media

This one might seem very obvious, but mobile streaming will absolutely ruin your data budgeting if you engage in it too often. It is hugely advantageous to download content in advance, and there are plenty of multimedia apps that make that fairly easy. Those with the user freedoms that come with YouTube Premium or YouTube Music Premium can be proactive here: go to the ‘background and downloads’ section of the app to adjust the settings, and by tapping the 3-line menu icon you can find the download button. Conduct your downloads while still at home on your Wi-Fi network and you won’t use even an ounce of data.

Put the Brakes on Play Store

Auto updates can dig into your data too, and the Google Play Store is one of the worst culprits here. Open up the app on your device and select Settings and then Network Preferences. Once you’re there you can choose to have auto updates limited to ‘over Wi-Fi only’, and you can choose the same for auto-play videos. Highly recommended, and easily done. If staying current with apps, games, and the like is important to you, you can choose ‘ask me every time’ to prevent ongoing auto updates here.

Go with Light Versions

Many services now offer scaled-down versions of apps and sites that you can use or visit without using so much data. Look for Google’s ‘Go’ branded apps here, including Google Go, Google Maps Go, Navigation for Google Maps Go, Gallery Go, Gmail Go, Google Assistant Go, and YouTube Go. All come ready made and will allow a sufficient user experience while not going too hard and heavy on your data allowance.

$1.44 Billion Ready to Go a Long Way for Satellite Internet in Canada

If digital connectivity isn’t an integral part of your life these days then you haven’t been living on earth, or at least anywhere outside of North Korea. What’s kind of ironic is that those folks are kept off the information superhighway entirely for the most part, while their cousins and next door neighbours in South Korea have the best Internet on earth. The logistical challenges that come with a country as large as Canada make those types of internet networks and speeds comparatively impossible, but there’s recently been a major development at the federal level that promises to make quality internet available to more Canadians than ever before.

Here at 4GoodHosting we imagine we’re like most Canadian web hosting providers in that the recent news that the federal government is directing 1.4 billion-plus dollars to Telesat satellite internet is something we NEED to talk about. The very basis of the service we provide is dependent on the web functioning as it should, but it’s been a well known fact for years that people in urban centres enjoy much better connectivity than those in more rural areas of the country. Considering that the web is for so much more than browsing or streaming and is increasingly key to participating in life and personal advancement, that’s not right.

Telesat is a Crown-owned Canadian satellite communication company, and what is in design is their Lightspeed low-earth-orbit (LEO) satellite constellation. The investment is in the development of the first and second generations of satellites, plus establishing domestic employment targets and – in the bigger picture – aiming to reduce costs for regional providers who rely on Telesat products to provide internet connectivity to customers.

This has the potential to be a MAJOR development for people in quieter parts of the country who have been disappointed in the inferior internet they’ve had for more than 20 years now. Previously it was a problem without a real practical solution, but satellite internet is poised to be that solution now.

298-Strong ‘Constellation’

Telesat’s Lightspeed constellation is going to be made up of 298 low earth orbit (LEO) satellites equipped to deliver gigabit speeds with impressive 30 to 50-millisecond latency, available anywhere in the world. The network will run on global priority Ka-band spectrum and boast a capacity of 15 Terabits per second. That would have sounded totally unrealistic even 10 years ago, but here we are – and isn’t progress a wonderful thing?

In our country the Lightspeed constellation will finally deliver the connectivity the country’s most isolated areas have been looking for over a long, long time now. Internet and cellular services enabled through Telesat will begin service about 4 years from now and connect 40,000 households in underserved communities. This is right in line with the government’s previously stated goal of providing high-speed internet to all Canadians by the end of this decade.

Jobs Too

Telesat is also going to invest $3.6 billion in capital expenditure in Canada, and this project’s development and then long-term infrastructure maintenance should provide up to 700 jobs. There is also going to be a focus placed on women in STEM programs when it comes to filling the engineering-related positions.

Partner to Starlink?

SpaceX’s Starlink probably needs no introduction, as this game-changer has already been discussed in the news at great length. Starlink is already making its way into Canadian homes, although with limited availability at this point. Starlink launched its testing phase in Canada earlier in 2021, allowing eligible Canadian customers to register for a satellite internet subscription. If anyone’s tried it and would care to let us know how it’s been for them, we’d love to hear it.

One big difference between Starlink and Telesat’s Lightspeed will be that Telesat will be making their powerhouse available to regional internet service providers. That is quite different from Starlink, which will sell its service directly to consumers.

Telesat has also received funding from provincial governments individually: Ontario to the tune of $109 million, and Quebec $200 million, plus a separate $200 million investment in the company by La Belle Province.

Could it be the appetite for genuinely high-speed and reliable internet is stronger in rural Quebec than elsewhere in the country? Who knows, but this is definitely a potentially major development in making it available to all Canadians, no matter where they live.

Avoiding VPNs with Tracker Apps

The appeal of VPNs won’t need much explanation, and as data storage and access needs continue to grow all the time we will see more and more organizations making the move to them. There’s been plenty of times the masses have been told that some new wrinkle in their digital product is innocuous or harmless, but at least some of the time that’s just not the truth. Many VPNs employ tracker apps, and the reasoning given for them is that they are part of offering you a more personalized experience and the like.

That in itself is true in the bigger picture, but in the smaller one it may be that tracker apps are actually putting you at risk. Here at 4GoodHosting we’re a Canadian web hosting provider that takes the well-being of our customers to heart, and given that some of you may be the ones making IT decisions for your business, venture, or interest, this is a topic worthy of discussion. The industry consensus very much seems to be that tolerating VPNs with tracker apps is something you shouldn’t be so indifferent to.

But of course, convincing is always required and so with that understood let’s get right to laying out why that’s the consensus, and why a less intrusive VPN may be best for you.

Important Protocols

Most IT decision-makers will be at least somewhat familiar with the protocols and encryption methods used by VPNs. Usually that level of understanding doesn’t make it at all clear why VPNs with trackers are creating risks. So let’s start at the start, with what a tracker is and what a tracker does. It’s actually a fairly basic explanation – a tracker is something that tracks the choices you make when moving around the Internet. Most websites and apps use trackers in some way, and they’re inclined to follow you nearly everywhere you go.

The information gathered by the trackers about you is then used for targeted advertisements and the like. The trackers are built by software developers at the behest of businesses that want to create greater profits by increasing the chances that like-minded people are made aware of what they have to offer.

1st and 3rd Party Trackers

Understanding the difference between first- and third-party trackers is also helpful in getting a better sense of them, and of why they are not as completely harmless as some people might think. The distinction between them is important. The ‘Cookies’ we’ve ALL heard of are examples of first-party trackers, used to remember things like your language, layout preferences, and even for saving your shopping cart.

It’s fair to say that cookies are by and large necessary for many websites to give you the type of visitor experience you’re expecting and refusing cookies from being stored is fairly straightforward if you have concerns about their function.

Third-party trackers are entirely different. They’ve been built into websites and apps for the explicit purpose of making money from you. What they are after nearly all of the time is PII – personally identifiable information. Examples could be your IP address, what browser you are using, where you choose to click, how long you stay on a certain web page, device specs and more. As you’d imagine, this is where most people start to think these trackers are overstepping their bounds.

Free to Profile

That’s because the information collected by 3rd-party trackers is used to create a profile of you, and from that profile come the targeted ads delivered to gain business and revenue from you. And yes, Google is ALL about 3rd-party trackers, with a ‘more the merrier’ attitude toward having them in place.

A lot of mobile apps will also make use of 3rd-party trackers, and in some ways you need to be even more aware when it comes to using a VPN that implements trackers in its apps. VPN apps that utilize trackers are compromising your privacy to make money, and that’s really the long and short of it. Trackers are not required for the app to function properly, and they actively leak your information to Google, Facebook, or whoever else among big data companies.

The extent of the information being collected will vary from app to app. But the use of trackers regardless means information about you is being shared, and this isn’t being communicated to users whatsoever.

More Capable Than You Think

Plenty of these third-party trackers are sophisticated to the point that they have a wide net of data to pull from, and often your IP address isn’t even needed to create a targeted profile for you. These trackers can use the huge amount of information they hold, plus a unique ID assigned to you, to connect the dots and still trace everything back to you. It is good to know that even if something as easily traceable as an IP address isn’t being shared, it may still be possible to connect the dots and track a person’s behavior online.
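To make that ‘connecting the dots’ idea concrete, here’s a minimal sketch of how a handful of non-IP signals can be hashed into a stable identifier. The attribute names and values are made-up illustrations, not any particular tracker’s actual scheme:

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine non-IP signals (user agent, screen size, timezone, fonts)
    into one stable identifier -- no IP address required."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = fingerprint({
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "America/Vancouver",
    "fonts": "Arial,Calibri,Verdana",
})
# The same browser produces the same ID on every site embedding the tracker,
# which is how separate visits get tied back to one profile.
print(visitor)
```

Because the hash is deterministic, the same combination of signals yields the same ID everywhere, while changing even one attribute produces a different one.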

This is why ever greater numbers of decision makers are deciding that a VPN service that is making use of trackers should not be trusted.

We’ll conclude here today by saying that it is possible in some instances to get clarity on what a VPN’s trackers might be getting up to. A good example is the Exodus tool, which is very useful for Android-specific information. Plus Apple is putting in place brand-new guidelines for App Store apps, making it mandatory that every single app disclose the information it collects, the permissions it needs, and what trackers are being used (if any). These are definitely steps in the right direction if people in general are going to become more trusting of these apps.

7 Means for a Cooler PC

It’s best to be cool, and here we’re not talking about ‘being’ anything other than not running too hot. Anyone who’s ever had no choice but to put a computer through its paces will know what it’s like to hear the cooling fan whirring furiously and failing to make much difference with a PC that’s running too hot. But there are things you can do to keep yours cool, and they have nothing to do with shades and a leather jacket.

CPUs aren’t the only components that get run a little ragged and start feeling the heat. There’s a whole lot of people who are harder on their GPUs than anything else, and if you’re an avid gamer you also probably don’t need to be told what it’s like to have a fan that sounds like it’s about to seize up and die if it’s made to go any harder.

Here at 4GoodHosting we’re like most good Canadian web hosting providers in that there’s but a few of us here who are real hardware experts. Most are more savvy on the soft side, but even if you’re not the type who could do open-heart surgery on a PC you can still make your PC less likely to overheat and perhaps shut down at the worst time possible.

That’s what we’ll share with our entry this week – 7 ways to keep your work or gaming station from overheating.

Clean out your PC

When we talk about cleaning here, we mean it literally, not getting rid of superfluous files and scripts and the like. A lot of dust and dirt comes in through those vents and ends up settling on the interior components of your PC. That can have negative effects depending on where the bulk of that crud builds up. Pet fur can be a part of it too.

So actually opening up the case and using a fine brush and/or some isopropyl alcohol to clean up problem spots can be a good idea.

Make Sure Fans are Properly Installed

It doesn’t happen often, but sometimes cooling fans are installed facing the wrong way, and when this happens they are 100% doing more harm than good. A fan’s orientation should have the intake side drawing in cool air and the exhaust side of the blades dispersing hot air from the unit.

Have a look at yours if you’re having a problem with overheating. If the blades facing you curve away from you, then they’re the way they should be. If the blades facing you curve towards you, then somebody messed up assembling or re-assembling the computer and you’ve got a very simple but promising repair job to do.

You should also confirm the configuration works well for airflow. Try to aim for a slightly positive pressure setup, and it’s something you can find out more about online with a simple Google search.
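As a rough illustration of what ‘slightly positive pressure’ means, here’s a quick sketch comparing total intake airflow against total exhaust airflow. The CFM figures are hypothetical examples, not measurements from any specific fan:

```python
def case_pressure(intake_cfm, exhaust_cfm):
    """Net airflow: positive means more air is pushed in than pulled out,
    which helps keep dust from being drawn in through unfiltered gaps."""
    net = sum(intake_cfm) - sum(exhaust_cfm)
    if net > 0:
        return "positive"
    if net < 0:
        return "negative"
    return "neutral"

# Two 50 CFM front intakes vs. one 70 CFM rear exhaust
print(case_pressure([50, 50], [70]))  # positive
```

The idea is simply to have intake fans move slightly more air than the exhaust fans remove, so incoming air passes through dust filters rather than case seams.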

Repaste the CPU Cooler

Having old thermal paste around your CPU cooler is pretty common, especially if yours is an older model and you’ve never had any reason or inclination to open it up and do basic maintenance. But if your computer is overheating now, maybe it’s time to give this a try, and it’s also not particularly difficult.

Redoing thermal paste can improve a CPU’s temperatures and repasting can also fix wonky applications for a brand-new build too. All you need to do is buy a tube of Arctic Silver 5 or something similar. Carefully scrape away the remnants of the existing paste and apply a new coating of it. It’s possible to also repaste on a GPU but it’s more challenging than doing it on a CPU.

Add Additional Fans

If one fan is not getting it done for you, you can opt to add more fans to cool your CPU or GPU more effectively. A good choice is to start with additional case fans, which tend to be the cheapest, and usually not too much is required to work them into existing cases. Many people choose to have two fans at the front.

Upgrade the CPU Cooler

Generally speaking, beefier third-party coolers are going to perform better than the stock ones that came with your PC. Much of the time, upgrading to a third-party CPU cooler is an easy way to drop your processor’s temperature.

Another option is to go with a closed-loop cooler, but it’s only really necessary when RAM clearance is an issue for you or aesthetics are a consideration.

Go with a Mesh Front Case

Mesh-front cases are really popular right now, and switching yours out and going with one of these is also fairly basic. They look different and they work much better for ventilating against heat buildup. A cooling fan upgrade and a mesh front case may make a major difference in your ability to stop your desktop from overheating.

The last suggestion we’ll make is a minor one so it won’t get its own subhead. Most computers are perfectly fine with going dormant or to sleep when they’re not in use rather than being shut down. But shutting down at least somewhat regularly is better for general prevention of CPU and GPU heating. It’s a good idea to let yours go off completely every once in a while.

Quantum Computing Goes Desktop

To say much has been made of the potential of quantum computing would be a big understatement. We’re really only just scratching the surface of what its reach can be, and it holds so much promise for improving our lives along with major contributions to the efficiency of business. And nowadays quantum computing is coming on leaps and bounds. Previously the capacity of it meant that the physical hardware was expansive and the farthest thing from portable, but now that’s changed and having quantum computing go desktop is a big development.

This has the potential to reach into every industry and interest, and here at 4GoodHosting we are like any quality Canadian web hosting provider in that it has the potential to reshape our landscape and those of the people who choose us to host their websites. Computing industry experts are calling an operating system available on a chip a ‘sensational breakthrough’.

About 50 quantum computers have been built up to this point, and all of them use different software, as a quantum equivalent of Windows, iOS or Linux doesn’t exist. But this new development delivers an OS that enables the same quantum software to run on different types of quantum computing hardware.

Let It Flow

The system has been named Deltaflow.OS and runs on a chip developed by consortium member SEEQC, using a tiny fraction of the space required by previous hardware. The chip is about the size of a coin, yet has all the same power and capacities as previous versions that were much larger. Its relevance for the future of quantum computers is huge, especially as it looks like these chips can be produced cost-effectively and at scale.

A little bit of explanation may be required here – quantum computers store information in the form of quantum bits, or ‘qubits’ as they are called. A qubit can exist in two different information states at the same time. Being truly powerful requires scaling up to include many more qubits, which is what makes solving seriously challenging problems possible. Racks full of electronics were previously required to control qubits, but now it’s all able to flow from a chip.
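To see why scaling up qubits is such a big deal, consider that each added qubit doubles the number of amplitudes a classical machine would need to track to simulate the system. A quick sketch of that growth:

```python
# Each qubit doubles the state space: n qubits -> 2**n complex amplitudes.
# This exponential growth is why classical simulation hits a wall, and why
# the control electronics for large qubit counts have to scale so hard.
def state_vector_size(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n:>2} qubits -> {state_vector_size(n):,} amplitudes")
```

At 50 qubits the count is already over a quadrillion, which is roughly why the ~50 machines built so far sit at the edge of what classical computers can check.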

Grand Vision

The long-term goal is to have an operating system that makes quantum software portable across qubit technologies – scalable to millions of qubits. Part of that will be teasing the highest possible performance out of every qubit, and that will apply to applications like error correction that require fast feedback loops too.

The next question then is what quantum computing will be used for, and what some specific benefit areas are. A sufficient supply of qubits will allow quantum computers to process complex calculations at very high speeds, and so there is very real application for chemical testing without the use of a physical lab, as just one example.

What this entails is taking that vast processing power and using it to simulate digital versions of chemical compounds, test theories and predict chemical reactions without needing a physical lab and staff going through the processes of the tests. What this could do for the pharmaceutical industry is huge, especially when you consider it takes about $1 billion to bring a major big-ticket new drug to market after many years of research, tests, and clinical trials. Quantum computing could speed this up and reduce research and development costs in a big way.

Better Batteries

If humans around the world are to achieve their carbon-neutral aims, the large-scale switch to EVs is going to mean the need for better batteries, and a lot of them. In much the same way the speed and reach of quantum computing can aid in drug development, the same virtual lab environment created by these computers may enable a much faster, less expensive, and more robust way to screen battery materials, leading to improved research and development towards a cleaner future.

We can expect to see quantum developments in logistics, weather prediction, cybersecurity, and finance too. The technology will evolve in step with firmware developments for quantum processors that will later interface with Deltaflow.OS. There’s also something of a contest to see who will be first to transform quantum computers from experimental technology into commercial products. This is being referred to as the ‘quantum advantage’ and that’s a term you may be hearing a lot more of over the next little while.

Multi-Cloud Strategies to Dominate Future

Cloud computing is now nearly ubiquitous in digital operations for businesses, and what it’s done to replace the need for physical storage, along with a whole host of other benefits, has obviously been a great development. It’s not often that a technological advance of this kind works so well while still in its infancy, but in fairness the cloud was a natural outgrowth of many other pivotal gains in computing technology coming together and bringing in other key components.

Evolution is always a part of these developments too, and when a new way of doing things is so eagerly adopted by so many people, those evolutions tend to come fairly fast and furious. Here at 4GoodHosting we’re like every good Canadian web hosting provider in that we’re a little more front row than others for watching these types of major advances as they take shape, and some of us also have the web and computer savvy to have a better handle on what it all means and how it promises to bring positive changes to the digital work world.

Being agile is an absolute necessity in the IT world, especially considering the roadmap is always changing. Not to say that full 180s are often required, but sometimes there is a need to plant your foot in the grass and pivot hard. Revisiting cloud infrastructure is always going to be a part of this, and more often than not it’s prompted by workloads increasing massively in size ‘overnight’. That term is used loosely, and while it’s not exactly overnight, the message is that big change requirements can come around surprisingly quickly and you need to be able to pivot and rechart without inflexibility.

At the forefront of all of this is a trend where multi-cloud strategies are seen as ideal fits for businesses, and that’s what we’ll look at here today.

Clearer Appeal

What is being seen now is multi-cloud strategies emerging as a dominant part of many organizations’ long-term IT roadmaps. A recent study conducted by an IT services firm in the US and UK came back with some interesting findings regarding the near future of multi-cloud. The report paints a clear picture of the industry and makes clear how companies’ investments in cloud services differ from what they were not so long ago.

What was interesting to note first was that nearly half of respondents indicated they had chosen a new provider in the last year, which shows that shares of the cloud market are very much up for grabs between the major providers. What needs to be a primary focus for organizations right now is investing in the correct cloud strategy for their unique IT workloads.

Standard interests like pricing or vendor incentives, security, and technical requirements are the top drivers when it comes to decision making related to cloud infrastructure, and it’s against these metrics that the bulk of decisions are made about what will serve the company best. Around 56% of respondents also indicated that security considerations will heavily impact final decisions around choosing a provider.

42% Considering Multi-Cloud Strategy

So as we see that organizations are indeed moving toward multi-cloud strategies, it’s important not to overlook how private cloud environments hold onto their importance for organizations trying to make the best workload placement decisions in the cloud.

Here are key findings from the survey:

  • Microsoft Azure leads the way as the most popular public cloud provider (58%), followed by Google Cloud (41%), IBM (40%) and AWS (38%)
  • Nearly half (46%) of respondents have gone with a new provider or platform within the last year — and more than 25% of them have made that move sometime in the past 6 months
  • Just 1% of respondents indicated having the same cloud provider or platform since their initial foray into cloud computing
  • 42% of respondents are pursuing a multi-cloud strategy
  • The vast majority of private cloud users (80% of them) stated better information security was their primary reason for going with a private cloud environment.
  • 89% of healthcare organizations and 81% of public sector interests foresee a continuing need for their private cloud over the next 5 years and beyond

Following all the major disruptions seen over the last year and some, the need for a modern technology stack is pushing cloud adoption hard in every industry. Capitalizing on the opportunities available in the market right now means cloud providers must meet these organizations’ complex security and legacy workload needs.

Password Hygiene? It’s a Thing

You might not know it, but the word hygiene has Greek mythology roots. Hygieia was a daughter of Asclepius, who you probably also didn’t know was the Greek god of medicine. Hygieia was the goddess of health, cleanliness and sanitation, and so that pretty much makes sense as far as where the word comes from. We all know how important it is to brush our teeth every day, but apparently hygiene also applies to those digital gatekeepers we call passwords.

We’ve all seen password strength meters that give you an idea of how suitably strong your password is, but maybe far too many people are going with 54321 or something of the sort. Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we’ve come across all sorts of stories of password failures over the years, and we try to make a point of giving customers some insights into good practices for the digital world if they need them.

And apparently the need is there. Passwords are still the primary form of authentication, but done poorly they can leave you vulnerable to attacks if your cybersecurity is not up to scratch. Passwords get stolen, and it’s happening a lot more often nowadays. They’re obtained by all sorts of underhanded means, and you may have some of yours that aren’t exclusively in your possession anymore too.

Billions Out There

At present there are billions of passwords available on the Dark Web, collected via various attack methods ranging from malware to phishing for them. Many are then used in password spraying and credential stuffing attacks.

The primary reason this is able to happen, according to web security experts, is that around 65% of users reuse some of their passwords. That’s highly inadvisable, and if you do it then you put yourself at risk of stolen or compromised credentials. Another estimate suggests 1 in 5 companies that suffered a malicious data breach had it happen because of stolen or compromised credentials.

So what is poor password hygiene? It’s really any type of choice or omission in setting or sharing passwords that leaves doors wide open for attackers. If you’re effectively your own IT department, a lack of knowledge about good password practices may be putting you at risk.

Put Effort into It

Choosing weak, easily guessable passwords like common keyboard patterns, or passwords that are obviously connected to an organization name, location or other common identifiers, is where a lot of people mess up. Another common move is changing passwords only by adding sequential characters at the end. An example would be changing password1 to password2.
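A minimal sketch of how a screening tool might catch these patterns follows. The organization terms and rules here are hypothetical examples for illustration, not a complete policy:

```python
import re

# Hypothetical org-specific terms; a real deployment would load its own list.
ORG_TERMS = {"acme", "toronto"}
KEYBOARD_RUNS = ("qwerty", "asdf", "12345", "54321")

def is_weak(password: str) -> bool:
    p = password.lower()
    # Common keyboard or numeric runs
    if any(run in p for run in KEYBOARD_RUNS):
        return True
    # Org name, location, or other obvious identifiers
    if any(term in p for term in ORG_TERMS):
        return True
    # "password1" -> "password2": a base word with a trailing digit bumped
    if re.fullmatch(r"[a-z]+\d{1,2}", p):
        return True
    return False

print(is_weak("password2"))  # True  (base word plus sequential digit)
print(is_weak("Acme2024!"))  # True  (contains an org identifier)
```

Checks like these run at password set or reset time, rejecting the candidate before it ever becomes a live credential.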

A great example of this is what happened to the Marriott hotel chain. Just last year attackers obtained the login credentials of two Marriott employees, then compromised a reservation system and ultimately exposed payment information, names, mailing addresses, and much more for millions of customers.

Why It Continues

Poor password hygiene continues to be a problem because it’s not visible enough as a problem or a potential threat. And thinking that attackers are only interested in targeting large organizations is incorrect too. Attackers do target SMBs, and do so more often given the increasing prevalence of online applications and remote technologies that can often be compromised fairly easily.

Overrating the security of two-factor authentication is another common misconception. Two-factor authentication is a good security measure, but it’s certainly not fail-safe. You still need your password to be as secure as possible.

And with Active Directory (AD), there is the belief that the password policy in AD is going to be sufficient. But it does not eliminate the use of compromised passwords, nor does it flag weak password construction patterns. You also shouldn’t assume that implementing and enforcing a robust password security policy is going to create any real degree of user friction.

Simplifying Password Security

Here are some fairly solid recommendations:

  • Choosing a password with a minimum length of 8 characters to encourage the use of longer passwords
  • Removing password expiration and complexity requirements
  • Screening new passwords against a list of passwords known to be leaked / compromised

You also need to take risk level into account. Removing expiration guidelines can leave a security gap given how long it takes organizations to identify a breach. It’s a good idea to go with technical solutions that can reduce the poor password hygiene issues this can create.
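To show what screening new passwords against known leaks can look like, here’s a minimal sketch using a toy set of breached-password hashes. A real deployment would check against a feed of billions of leaked hashes rather than this tiny hard-coded set:

```python
import hashlib

# Toy breached-password set, stored as SHA-1 hashes the way leak feeds
# commonly distribute them. Real lists contain billions of entries.
BREACHED_SHA1 = {
    hashlib.sha1(p.encode()).hexdigest().upper()
    for p in ("123456", "password", "qwerty", "letmein")
}

def is_compromised(candidate: str) -> bool:
    """Hash the candidate and look it up in the breached set."""
    digest = hashlib.sha1(candidate.encode()).hexdigest().upper()
    return digest in BREACHED_SHA1

print(is_compromised("letmein"))               # True: reject at set/reset time
print(is_compromised("mauve-tango-47-pylon"))  # False: allowed through
```

Run at password set and reset time, a check like this blocks known-leaked passwords before they can be used in credential stuffing attacks.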

Other good practices are:

  • Eliminating the use of common password construction patterns
  • Supporting user-oriented features such as passphrases (more memorable longer passwords) and length-based password aging, which also allows less frequent password expiration because of how lengthy and strong the passwords are
  • Continuously blocking the use of leaked passwords
  • Enabling users to reset their passwords with MFA (multi-factor authentication) from anywhere
  • Working with existing settings you already use, such as Group Policy

This is something that you want to be proactive about, and it’s really not asking too much of people to come up with a more solid and secure password. Go through what can happen if you have a weak password and you’ll know why that’s a headache you really want to avoid.