Pros of a USB-C Hub Monitor

Reading Time: 5 minutes

There have been plenty of instances where we’ve joined the consensus and stated that you can’t stop progress, and truth be told we’re of the mindset that most of the time you wouldn’t want to anyway. That’s especially true of anything that can make our workday better and boost our productivity – something that matters whether you’re working for yourself or for somebody else as an employee. And for so many of us that involves working in the digital realm on personal computing devices.

So it’s here that we’re looking at the benefits of a USB-C hub monitor, and the primary and most obvious reason they’re a practical choice is that so many of us now need to run peripherals from our main device. External power bricks may have been the norm for a while, but we’re willing to guess they’re going to start making their way towards obsolescence. And that’s a good thing, as less is always more when it comes to any desktop workstation.

All this resonates with us here at 4GoodHosting in the same way it would with any Canadian web hosting provider, given that our work puts the same premium on efficiency and getting a good job done as quickly and thoroughly as possible. So let’s spend this entry looking at what makes USB-C hub monitors such a good choice for anyone who spends the bulk of their workday at a desk.

Smart Upgrade

A USB-C hub monitor is every bit a smart upgrade for your home office. It removes the need for numerous cords running across your desk: rather than connecting peripherals to your PC, you connect them directly to the monitor, which links back to your PC over a single USB-C cable. Desktops get added functionality too, but it is laptops that benefit most from USB-C. That one cable can act as both a video cable and a power cable, which means you don’t have to have the device’s standard power adapter resting on the desktop or hanging to the floor beside it.

Most people who own a USB-C compatible laptop will need a USB-C hub or dock anyway, so by bundling it into the monitor you won’t need to find space for a separate hub or dock. The best of them include Ethernet, multiple USB-A ports, and support for daisy-chaining displays over the unit’s DisplayPort output. A USB-C monitor that can handle numerous peripherals at once is quite the treat when you put that versatility to work.

For the ultimate clean setup you can’t beat buying a monitor arm with a laptop stand and using wire clips to route wires behind the arm. You’ll remove all wires from sight, and only the USB-C cord connecting the laptop to the monitor will be visible.

Be ‘Hub’ Specific & Demanding

It’s important to know that USB-C hub is not a standardized term, so you need to make sure you’re getting what you need and not something with a misleading name. A USB ‘hub’ is simply any device that extends USB connectivity. That’s it, and this is why you should pay attention to a monitor’s specifications – the benefits of using many of them as a USB-C hub are limited.

The best USB-C hub monitors are ones featuring at least four additional USB ports with a mix of USB-A and USB-C, and it’s even more preferable if the monitor includes an Ethernet port, since many notebooks and laptops don’t have one of their own.
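
If it helps to keep that checklist in one place while you compare spec sheets, here’s a minimal sketch of it in Python. The spec dictionary and its field names are just our own shorthand for the criteria above, not a format any manufacturer actually publishes.

```python
# Rough checklist for vetting a monitor spec sheet as a USB-C hub,
# based on the criteria above. The "spec" dictionary and its keys are
# our own shorthand, not anything a manufacturer publishes.

def looks_like_a_real_hub(spec: dict) -> bool:
    """True if the monitor offers enough connectivity to act as a genuine hub."""
    usb_a = spec.get("usb_a_ports", 0)
    usb_c = spec.get("usb_c_ports", 0)
    enough_ports = usb_a + usb_c >= 4           # at least four additional USB ports
    mixed_types = usb_a > 0 and usb_c > 0       # a mix of USB-A and USB-C
    has_ethernet = spec.get("ethernet", False)  # a big plus for laptops without one
    return bool(enough_ports and mixed_types and has_ethernet)

example_monitor = {"usb_a_ports": 3, "usb_c_ports": 1, "ethernet": True}
print(looks_like_a_real_hub(example_monitor))  # True
```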

Display & Power Interests

No USB version includes a video standard in its base specification. Instead, USB-C devices handle video via an optional addition called DisplayPort Alternate Mode. USB-C monitors will list the version of DisplayPort supported by the USB-C port in the monitor’s specifications, and this is also something you will want to be looking for.

The version of DisplayPort used by the monitor isn’t something you need to dwell on, as any monitor that supports video over USB-C will use a DisplayPort version sufficient for driving it at its native resolution and refresh rate. But do check that a USB-C port with DisplayPort Alternate Mode is available on the PC you plan to connect to the monitor. Not all PCs with a USB-C port support this, and the DisplayPort version will matter if you connect a device with a high-refresh display.

Most USB-C monitors support USB Power Delivery, the standard for supplying power over USB. The amount of power on offer varies quite a bit between devices, though, from a few watts up to 100W or more. Determining whether the USB-C monitor can provide enough power for your devices is another consideration.

That can be done by checking the monitor’s specifications to find out how much power it can send over USB-C. Compare that number to the rated wattage of your laptop’s power adapter. The monitor should support USB Power Delivery at a level equal to the wattage supplied by the laptop’s power adapter.
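
For anyone who likes to see that comparison written out, here’s a minimal sketch of the check in Python. The wattage figures are placeholders, so swap in the numbers from your own monitor and laptop spec sheets.

```python
# Quick sanity check: can the monitor's USB Power Delivery output stand in
# for the laptop's own power adapter? The wattage figures below are
# placeholders - substitute the numbers from your own spec sheets.

def monitor_can_power_laptop(monitor_pd_watts: int, adapter_watts: int) -> bool:
    """True if the monitor's USB-C Power Delivery rating meets or exceeds
    the laptop adapter's rated wattage."""
    return monitor_pd_watts >= adapter_watts

monitor_pd_watts = 90  # e.g. the monitor's spec sheet lists 90W over USB-C
adapter_watts = 65     # e.g. the laptop ships with a 65W adapter

if monitor_can_power_laptop(monitor_pd_watts, adapter_watts):
    print("Single-cable setup should work: the monitor's PD covers the laptop.")
else:
    print("The monitor's PD falls short - keep the laptop's own adapter handy.")
```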

Thunderbolt is Best

Thunderbolt Hubs are similar to USB-C ones, but they have some distinct advantages. For starters they have a higher minimum data rate than USB. Thunderbolt 3 and Thunderbolt 4 must support a data rate of 40Gbps, while USB 3.1 supports a minimum data rate of 10Gbps, and USB4 supports a minimum of 20Gbps.

The higher data rate you get with a Thunderbolt hub could be important if you want to connect multiple high-speed storage devices or implement external graphics. Power delivery stays variable too, so you still need to check that the power delivered by a Thunderbolt hub monitor is enough for the device you want to connect.

The last mention we’ll make here is for USB4, the latest USB standard. It stands ready to increase the minimum data rate for USB to 20Gbps and support a maximum of 40Gbps, which will pair especially well with the latest version of USB Power Delivery supporting up to 240 watts of power. This is going to allow easy, single-cable connections with many laptops that are currently incompatible for that very reason.

Now the bad news: compatible devices are rare. Only a few USB4 hubs or docks are available, and none of the USB-C monitors shown at CES 2022 announced support for USB4 or the 240-watt Power Delivery standard.

USB4 makes Power Delivery a standard inclusion and has the option to support the latest maximum of up to 240 watts of power, but it doesn’t increase the required minimum wattage. You’ll still need to keep a close eye on exactly how much power a USB-C monitor provides.

Conclusion

USB-C hub monitors can be a bit confusing, but deciphering the details is worth the effort. You’ll need to do the same mental gymnastics to buy the right USB-C hub or dock, anyway. Choosing a USB-C hub monitor over a standalone dock will offer the same benefits and save space on your desk.

Intel and Their Aim to Dominate Data Chips

Reading Time: 3 minutes

To lean on something means to rely on it, and with that understood it’s fair to say companies are leaning on big data more and more by the day. All of this was foreseen in the way digital connectivity was revolutionizing business 20+ years ago, but here we are now with data centre capacity being more of an issue than ever before. Meeting the demand for chips that can deliver that capacity has been made harder by the worldwide chip shortage going on right now, and that’s a reflection of just how much of private, public, and municipal life has gone digital.

This is something that any provider of web hosting in Canada can relate to, as most will be like us at 4GoodHosting in that we’ve invested in data centre expansions so that we have facilities in different major regions of the country – BC and Ontario in our case. It’s an essential part of being able to offer the uptime guarantees we do with our Canadian web hosting packages. It’s elementary that we have a solid understanding of data and colocation centres, and as such the recent news of Intel’s new Xeon roadmap definitely has our attention given what the chipmaker giant is aiming for.

Intel has revealed ambitious plans to establish a degree of dominance in the data centre chip landscape in the immediate future. They recently laid out their multiyear Xeon roadmap for the next few years, including the heralded Sapphire Rapids, a groundbreaking new chip built on the Intel 7 process that is estimated to boost AI performance by a massive 30x.

Better-serving tech always sounds really good around here, so let’s spend this week’s entry looking at the new Intel data centre chip and all that it may offer to improve user experiences.

Rapid Expansions

This new chip is going to be an introductory offering that will be followed by Emerald Rapids. The follow-up Emerald chip is also built on Intel 7 and should arrive in 2023, with Sierra Forest, built on Intel 3, set to come out in 2024. We’re also seeing Intel planning to move its Granite Rapids offering from Intel 4 to Intel 3 on the same timeframe.

The various Rapids versions are aimed directly at the top of the CPU market, and their most immediate appeal is the performance leap across AI and ML workloads. The way people have taken to the Cloud is a factor here, as Sapphire Rapids is especially focused on data centres, and that lines up with one of the most relevant topics in computing these days.

Intel is stating that it offers better performance across a multitude of workloads useful to data centres, and the processor class is set to ship next month. Any timeframe for the others is yet to be determined, and of course that depends on just how long the next-generation technology takes to properly develop.

The reality, though, is that the competition is nearly keeping pace and the company has challengers – Amazon especially, as they have recently begun exploring the development of their own silicon for data centres.

Focus on Developer Ecosystems

The Xeon roadmap is expected to be industry leading, and industry experts laud it as a product portfolio developed around the real-world needs of the majority of Intel customers, aligned to their timelines, and designed to be conducive to developer ecosystems and real innovation within them.

The chip will be the first Intel server processor to use extreme ultraviolet lithography, the deployment of which is a key technology if Intel is to catch up with TSMC and other top chip manufacturers. Intel officials said on Thursday that the company plans to reach 10% annual sales growth by 2025, but with moderate revenue growth this year. They are also entering an investment phase, expecting at least $1 billion in negative free cash flow in 2022 alongside increases in capital spending.

Better Patching Making For Better Online Security

Reading Time: 3 minutes

There’s seemingly no stopping the trend that every new day we face the greatest online risks ever seen, and that’s why cybersecurity is an ongoing big deal for both individuals and organizations. No one likes to be the recipient of malware or to have their private information exposed while just going about their day-to-day online, but it’s a very real possibility.

Companies that make digital products should be proactive in making sure those products are safe to use when connected to the web, but that’s been slow to come around on a grander scale. Fortunately it now is, and major players like Google, Apple, and Microsoft have been much better about finding the right preventative fixes and making them available to people in a timely manner.

Here at 4GoodHosting we are like any quality Canadian web hosting provider in that we can relate to the importance of security patches being made available, especially for businesses that have no choice but to handle sensitive data provided by customers or business associates. Nowadays, as cybersecurity threats become more pronounced and far reaching, the means of addressing and thwarting them are advancing in step better than at any time before.

93.4% Fix Rate

New research from Google indicates companies are getting much better at fixing security vulnerabilities found in their products. Many firms are also taking less time to address issues and blowing past their established patch deadlines less frequently than in previous years.

Project Zero is Google’s team of security analysts tasked with finding zero-day vulnerabilities – unknown or unpatched flaws that can be abused through malware. The team recently published a blog post detailing 376 issues it found between 2019 and 2021, how vendors responded to the findings, and what the success of those responses meant for overall cybersecurity in the digital realm.

Of those 376 issues, 351 (93.4%) have been fixed and only 14 (3.7%) have not had any type of fix applied to them. The remaining 11 (2.9%) are still active, with 8 of those classified as having already passed their 90-day deadline.
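
For anyone who wants to double-check those proportions, a few lines of Python reproduce the percentages from the raw counts reported above (the counts themselves are taken straight from the Project Zero figures).

```python
# Reproduce the Project Zero fix-rate percentages from the raw counts above.
total = 376
fixed = 351
no_fix = 14
still_active = total - fixed - no_fix  # the remaining 11 issues

for label, count in [("fixed", fixed), ("no fix", no_fix), ("still active", still_active)]:
    print(f"{label}: {count} of {total} ({count / total:.1%})")
# fixed: 351 of 376 (93.4%)
# no fix: 14 of 376 (3.7%)
# still active: 11 of 376 (2.9%)
```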

Google, Microsoft, and Apple Doing Best

Roughly two-thirds of all these vulnerabilities (65%) are attributable to these 3 major companies. Microsoft has had 96 (26%), Apple 85 (23%), and Google 60 (16%). In the evaluation 90 days was the deadline for a vendor to fix an issue and ship an improved version to its customers’ endpoints. A 14-day grace period was made available if the vendor asked for it while still promising to deliver a patch fix.

Apple did best with the reported vulnerabilities, fixing 87% of them within that 90-day window. Microsoft came in second at 76%, and then Google with 53% fixed. Microsoft had the most patches issued during the grace period (15 flaws, or 19%). Google was the fastest, though – an average of 44 days to fix a problem compared to Apple’s 69 days or Microsoft’s 83 days.

These numbers are more significant when you compare them to how long it took these 3 to achieve the same thing in previous years. It took Apple 71 days on average to fix an issue in 2019, and in 2020 it was 63. It took Microsoft 85 days in 2019, rising to 87 for 2020. Google didn’t move much either way, but overall these companies have been consistently cutting down on the time required to address vulnerabilities.

The good news is that vendors are now fixing almost all of the bugs they receive, and doing it relatively quickly. The past three years have also seen accelerated patch delivery, as the vendors have learned best practices from each other and benefited from increasing transparency in the industry.

Paid Rewards

Google has a Vulnerability Reward Program (VRP), and through 2021 Google and the wider cybersecurity community discovered thousands of vulnerabilities, some of which were found by researchers outside the company in exchange for paid rewards. The sum of those rewards is apparently in the vicinity of $800k. Nearly 700 researchers have been paid out for their hard work in discovering new bugs, with the highest reward being $157,000, going to a researcher who discovered an exploit chain in Android.

The Android VRP paid out twice what it did the year before, rising to almost $3 million. A total of 115 Chrome VRP researchers were rewarded for 333 unique security bugs found, and the payouts for those totalled into the millions.

Brain Synapse Function Possibly a Part of the Next Generation of PCs

Reading Time: 3 minutes

100 billion is an incredibly big number, and yet a fully developed human brain can have up to 100 billion neurons in the extensive neural network that gives the brain the framework it needs to be the amazing mega processor that it is. Much of the focus with A.I. in computers has been on replicating the function of the human brain as closely as possible, and to date that’s happened with varying measures of success, along with difficulty in even measuring the criteria for it.

The key conduits in a brain’s neural network are synapses – literally the bridges between cells along which bioelectrical impulses carry cognition, impulses, feelings, and pretty much anything and everything rooted in mental function. Those impulses have their roots in the different cortex centres of the brain, and in much the same way they are found in the chips of computers and similar devices. Up until now the relaying function of these chips has lost something of its power and fidelity, but that’s a shortcoming with a fix possibly arriving soon.

New chips modeled after the brain’s neural network are making waves for what they can do to expand the capabilities of computing devices. Here at 4GoodHosting it goes without saying that this is a topic of interest for any Canadian web hosting provider, or any other type of provider with an inherent interest in making devices capable of more while keeping them suitably compact and usable.

It’s certainly something that could end up benefiting everyone, especially when you think about what it’d be like to have computers as sharp as what we’re all lucky enough to have between our ears. It may be a reality in the not-too-distant future, and so that’s what we are going to look at this week.

World’s First Electrochemical 3-Terminal Transistor

All of this has to do with a newly developed device: the world’s first electrochemical 3-terminal transistor manufactured with 2D materials. The key component is a titanium carbide compound called MXene, which takes classical transistor technology into a whole new stratosphere of transmission possibilities.

This is what allows it to function more in line with how a brain would, maintaining signal integrity and preserving all the nuanced complexity it needs to have. In these new chips the electrochemical random access memory (ECRAM) behaves as a synaptic cell in an artificial network, establishing itself as a one-stop shop for taking in data and then processing and storing it. Computers equipped with chips built this way could rely on components that can have multiple states and perform in-memory computation, in ways that would make current capabilities seem pedestrian at best.

Leading to Even More

The further belief among computing science experts is that MXenes could be fundamental to developing neuromorphic computers that are closer in operation to human brains and immensely more energy efficient than today’s traditional computers – in some cases thousands of times more efficient, to go along with more detailed and finer computational abilities. A good number of developers will be familiar with CMOS wafer assemblies where layers of 2D materials are integrated on silicon, and these new chips will do much the same with the 3-terminal transistors. What this amounts to is a true hybrid integration using the same back-end-of-line processes.

What can be expected? For starters, these super chips would have write speeds upwards of 1000x faster than any other ECRAM built to date. If 2D ECRAMs were scaled down to nano dimensions, their sub-nanosecond switching would make them as fast as the transistors in today’s computers, so it’s reasonable to think they could be fused into our current computers using the CMOS technology process.

That’s due to the 2D transistor materials being entirely compatible with the CMOS fab process, and the belief is that within a decade users may be able to fabricate special-purpose computer blocks where memory and transistors merge, making them at least 1000x more energy efficient than the best computers we have today. AI and simulation tasks could even see a million-fold energy efficiency gain for certain algorithms. These new chips are eventually going to be seen in cloud computing services like web hosting and website builders.

The first commercial products with this kind of mega-powerful chip in it may still be a long way off, but industry experts are saying we might see offerings becoming available before the end of the 2020s.