The Coming Significance of Platform Engineering in Web Development

A cobbler is the person who repairs shoes, although it is increasingly rare that anyone gets a pair of shoes repaired these days. In fact, most shoes aren't even made with any real possibility of reattaching a sole or anything similar. In the 21st century you could say it's more likely that someone will be putting those same energies into some type of digitally-built tool or resource, and one that people will likely get far more mileage out of than an inexpensive pair of sneakers. 'Cobbling' in that sense means putting together a collection of disparate tools and then making them work as well as can be expected in web development.

That's our roundabout way of introducing platform engineering, which, if you haven't heard of it, is likely to be the next big thing in development: a means of bringing technologies together with more functional compatibility so that builds come together a) with much more speed, and b) with a 'finished' product arriving much sooner. What it stands to replace are the various home-grown self-service frameworks that have, for the most part, shown themselves to be brittle, high maintenance, and often way too expensive once everything is taken into account.

We'll get into the meat of what may be possible with this here, but it goes without saying that here at 4GoodHosting we're like any other quality Canadian web hosting provider in seeing how relevant a technological development this may be for the people who do the work behind the scenes that makes our digital world what it is. Interesting stuff for sure.

Engineering Made Common

More succinctly, platform engineering is the practice of building and operating a common platform that internal development teams share and use to accelerate software releases. The belief is that it will bridge the gap between software and hardware, with platform engineers enabling application developers to release more innovative software in less time and with much more efficiency.

Not only is this super relevant for platform engineering, it is potentially big for the hybrid cloud too. The aim is to have self-service be both a hallmark of DevOps maturity and a key platform engineering attribute, one proven to support developer-driven provisioning of applications along with any underlying infrastructure they require. The value in that should be self-evident, as today application teams working in hybrid and multi-cloud environments require different workflows, tools, and skillsets across different clouds.

This means complexity, and often too much of it for things to get done right. The need is to get to a final product ASAP, and that challenge shines a spotlight on the growing demand for a unified approach to platform engineering and developer self-service.

Ideal for Scaling DevOps Initiatives

There are estimates that upwards of 85% of enterprises will come up short in their efforts to scale DevOps initiatives if no self-service platform approach is made available to them by the end of next year (2023). To counter that, the recommendation is that infrastructure and operations leaders begin appointing platform owners and establishing platform engineering teams. The important point, though, is that they build in self-service infrastructure capabilities that are in line with developer needs at the same time.

Similar estimates suggest that by 2025, 75% of organizations with platform teams will provide self-service developer portals as a means of improving developer experience and giving product innovation a real boost, given how much more quickly results are seen and how much better developers understand where the current development path is likely to take them.

Compelling Benefits

Accelerated software development is the big carrot on the end of the stick leading investment into platform engineering. Pursued with enthusiasm, it ensures application development teams bring that productivity to all aspects of the software delivery cycle. Ongoing examination of the entire software development life cycle, from source code through test and development and on into provisioning and production operations, is the best way to get the back half of the development equation right.

From there, what it promotes is much better processes and platforms that enable application developers to rapidly provision and release software. Teams will probably be using infrastructure automation and configuration management tools like Ansible, Chef, or Puppet. These tools are conducive to continuous automation that extends processes used in software development to infrastructure engineering. Look for infrastructure-as-code (IaC) tools such as Terraform to work well for codifying the tasks required to provision new resources, and to continue playing a key role in the growth of platform engineering and platform engineering teams.
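
To make that a little more concrete, here is a minimal sketch of how a platform team might wrap Terraform behind a simple self-service provisioning function. The module path, variable names, and function are hypothetical, and it assumes the Terraform CLI is installed and a reusable module has already been written; treat it as an illustration of the idea rather than a production implementation.

```python
# Minimal sketch of a self-service provisioning wrapper a platform team
# might expose to developers, assuming the Terraform CLI is installed and
# a reusable module already exists at the (hypothetical) path below.
import subprocess
from pathlib import Path

MODULE_DIR = Path("modules/app-environment")  # hypothetical IaC module

def provision_environment(app_name: str, environment: str) -> None:
    """Initialize and apply the Terraform module for one app environment."""
    tf_vars = [
        "-var", f"app_name={app_name}",
        "-var", f"environment={environment}",
    ]
    # Download providers/modules; safe to re-run on every invocation.
    subprocess.run(["terraform", "init", "-input=false"],
                   cwd=MODULE_DIR, check=True)
    # Apply the codified infrastructure without interactive prompts.
    subprocess.run(["terraform", "apply", "-input=false",
                    "-auto-approve", *tf_vars],
                   cwd=MODULE_DIR, check=True)

if __name__ == "__main__":
    # A developer requests a staging environment for their service.
    provision_environment("checkout-service", "staging")
```

In practice a function like this would sit behind an internal portal or API, so developers get push-button environments without ever touching the IaC internals directly.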

Most notable at this time is Morpheus Data, a platform engineered for platform engineers and set up explicitly to allow self-service provisioning of application services into any private or public cloud. When combined with Dell VxRail hyperconverged infrastructure, it can speed up digital transformations impressively while improving your cloud visibility at the same time.

FAXID For Much Faster Ransomware Detection

Long gone are the days when you actually had to have a captive individual in order to demand a ransom. Nowadays that would be very uncommon; much more often it is digital property rather than a person that's been captured, and the takers are looking to get paid if that property is to be released. We've gone on at some length here about how costly it can be for companies that choose to be lax about cybersecurity, especially nowadays. The age-old dance with all of this remains the same: security improves, threats evolve, security improves to counter those evolutions, and then threats evolve again.

And of course it's the bigger fish that need to be concerned about frying. If you're on the smaller side of the scale when it comes to running a business you probably won't be targeted, but there's still no guarantee you won't be. We don't claim to be web security experts, but here at 4GoodHosting we're like any good Canadian web hosting provider in that we can point you in the direction of one if that's what you need. We do understand the basics of the subject, and that's part of the reason we're fairly keen to share any news related to it here, especially when it means even better means of avoiding a ransom situation.

So what is newsworthy here is a new technology that is proving itself to be MUCH faster at identifying ransomware attacks and detecting them early enough that countermeasures can be implemented. That kind of detection will be part of the complete cybersecurity plan that is a must for any business big enough to face serious losses if its data is accessed and then held for ransom.

Malware Meeting Its Match?

A new approach to ransomware detection has been developed by researchers, and its appeal is that it is able to detect a broad range of ransomware far more quickly than previous systems. We will assume we don't need to provide much of an explanation of what ransomware is here, but if we do: it is a type of malware that, when it infiltrates a system, encrypts that system's data so it becomes immediately inaccessible to users.

What follows next are the demands; the people responsible for the ransomware make it clear to the system's operators that if they want access to their own data they had better send money. And this type of digital threat has already proved plenty expensive. The FBI says it received 3,729 ransomware complaints in 2021, with around $49 million paid out in ransoms. That's a lot of money, and it makes clear why attackers are going to the lengths they are to make their ransomware sneakier before putting it out there.

We do know that computing systems already make use of a variety of security tools that monitor incoming traffic to detect potential malware and prevent it from breaching the system, and new ransomware detection approaches are being evaluated all the time by many different interest groups and developers. A lot of it is very effective IF it can be implemented in a timely way.

The challenge is detecting ransomware quickly enough to prevent it from fully establishing itself in the system. File encryption begins as soon as ransomware enters the system, so if the countermeasures can be alerted that it's time to act, that is going to be very beneficial.

FAXID Pairs with XGBoost

What's getting buzz these days, and why we are on this topic, is a machine-learning algorithm called XGBoost. It has been proven effective at detecting ransomware for some time, but up until now, when systems run XGBoost as software on a CPU or GPU, it doesn't run quickly enough. Add to that the fact that attempts to incorporate XGBoost into hardware systems haven't gone as well as hoped because of a lack of flexibility.
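
As a rough illustration of the software side of that equation, here is a minimal sketch of XGBoost being trained as a ransomware classifier in Python. The features and labels are entirely synthetic stand-ins (real systems derive features from live telemetry such as file I/O rates and the entropy of written data), and FAXID's contribution is moving this kind of inference into hardware; the sketch shows only the classification concept.

```python
# Minimal sketch of software-side ransomware detection with XGBoost.
# Features and labels are synthetic stand-ins for real telemetry such
# as file writes/sec, entropy of written blocks, and files renamed/sec.
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(42)

# Each row is one observation window of (hypothetical) behavioral features.
X = rng.random((1000, 3))
y = (X[:, 1] > 0.7).astype(int)  # placeholder labeling rule for the demo

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Gradient-boosted trees; binary classification of benign vs. ransomware.
model = XGBClassifier(n_estimators=100, max_depth=4, eval_metric="logloss")
model.fit(X_train, y_train)

print("Held-out accuracy:", model.score(X_test, y_test))
```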

Hardware implementations tend to focus on very specific challenges, which makes it difficult or impossible for them to stay on top of the full range of ransomware attack types and identify them as quickly as needed.

But this new FAXID technology is a hardware-based approach that allows XGBoost to monitor for a wide range of ransomware attacks, and to do so much more quickly than the existing software approaches.

Not only is FAXID just as accurate as software-based approaches at detecting ransomware, but the speed at which it can do so is drastically higher. FAXID was up to 65.8x faster than software running XGBoost on a CPU and up to 5.3x faster than software running XGBoost on a GPU.

FAXID is also getting high marks for the way it allows problems to run in parallel. Rather than allocating all of the security hardware's computing power to a single problem, you could devote some portion of the hardware to ransomware detection and another portion to a different challenge, like fraud detection or some other threat that may be present at the same time.

This has a lot of potential for cybersecurity as a whole, given the current atmosphere where ransomware attacks are becoming much more sophisticated. People in business should be thankful these types of advances are being made, as they may well spare them quite the expensive headache in the future.

3rd-Party Web Trackers Logging Information Entered Before Submission


Anyone and everyone is going to be extra mindful of what information is shared digitally these days, and even most kids are aware that you can't be entirely at ease about what you type into submission fields and then press 'Enter'. You need to be mindful of what you share, but it turns out you need to be just as mindful before you even press that button at all. Many people may think they've smartly avoided any potential problems by backspacing over something they typed and were about to submit, but it turns out the damage may already be done.

We'll get to what exactly is at issue here, but before we do we should make clear that leaks aren't always intentional. Many times information is exposed not because someone chooses to expose it, but because the information sits in a location that doesn't actually have the security protocols its owners or users assume it does. The truth of the matter is that it is nearly impossible to be airtight with this stuff 100% of the time.

Here at 4GoodHosting we're like any other good Canadian web hosting provider in that we like to share information with our customers any time we come across something we know will have real significance for them. This is one of those scenarios, as nearly everyone will at some point voluntarily provide information about themselves when asked to do so online. Any way you can be more in the know about the dos and don'ts of this is going to be helpful, so here we are for this week.

Made Available

A recent study that looked into the top 100k ranking websites indicates that many are leaking information you enter into site forms to third-party trackers, and that this may be happening even before you press submit. The data being leaked may include personal identifiers, email addresses, usernames, and passwords, along with messages that were entered into forms but deleted and never actually submitted.

This type of data leak is sneaky because until now internet users could assume that the information they type on websites isn't available unless they submit it. That IS true most of the time, but for almost 3% of all tested sites the possibility exists that once it's typed out, it has already been made available, and that's the reality even if you never actually submit the info.

A crawler based on DuckDuckGo's Tracker Radar Collector tool was used to monitor exfiltration activities, and the results confirm that this is very much a possibility. Worse, there's little if anything that could tip users off to when this risk is present and where information should ideally not be entered into a field at all.
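
The real pipeline drives a full browser and a trained classifier, but as a simplified, hypothetical stand-in for one piece of it, here's a short Python sketch that scans a page's static HTML for the email and password fields the study watched; everything in it (the function name, the attribute checks) is illustrative only.

```python
# Simplified stand-in for one piece of the study's pipeline: locating
# email and password inputs in static HTML. The actual crawler (built on
# DuckDuckGo's Tracker Radar Collector) drives a real browser and uses a
# pre-trained classifier rather than these simple attribute checks.
import requests
from bs4 import BeautifulSoup

def find_sensitive_fields(url: str) -> list:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hits = []
    for field in soup.find_all("input"):
        ftype = (field.get("type") or "text").lower()
        name = (field.get("name") or "").lower()
        if ftype in ("email", "password") or "email" in name:
            hits.append(f"{ftype} field: name={name or '(unnamed)'}")
    return hits

if __name__ == "__main__":
    for hit in find_sensitive_fields("https://example.com"):
        print(hit)
```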

Nearly 3,000 Sites

The crawler was equipped with a pre-trained machine-learning classifier that detected email and password fields, as well as making access to those fields interceptable. The test then crawled 2.8 million pages found on the top 100,000 highest-ranking sites in the world, and found that 1,844 of those websites let trackers exfiltrate email addresses before submission when visited from Europe. That is not such a high percentage, but the same ratio in America is an entirely different story.

When visiting those same websites from the US, a full 2,950 sites were collecting information before submission, and in addition researchers determined 52 websites were collecting passwords in the same way. It should be mentioned that some of them did make changes and improve security after being made aware of the research findings and informed that they were leaking.

But the logical next question is: who is receiving the data? We know that website trackers monitor visitor activity, derive data points related to preferences, log interactions, and create an ID for each user, one that is, supposedly, anonymous. Sites use trackers to give a more personalized online experience to their users, and the value for them is in having advertisers serve targeted ads to their visitors with an eye to increasing monetary gains.

Keystroke Monitoring

The bulk of these 3rd-party trackers use scripts that monitor keystrokes inside a form. They save the content as it is typed, collecting it even before the user has pressed that submit button. The fallout is that data entered on forms gets logged and the supposed anonymity of tracker IDs is lost, pushing up privacy and security risks big time.

There are not a lot of these trackers out there, and most of the ones in operation are known by name. 662 sites were found to have LiveRamp's trackers, 383 had Taboola, and Adobe's Bizible was running on 191 of them. Further, Verizon was collecting data from 255 sites. All of this squares with the understanding that the problem stems from a small number of trackers that are prevalent across the web.

So what is a person or organization to do? The consensus is that the best way to deal with this problem is to block all 3rd-party trackers using your browser's internal blocker. A built-in blocker is standard in nearly all web browsers, and it is usually found in the privacy section of the settings menu.

Private email relay services are a smart choice too, because they give users the capacity to generate pseudonymous email addresses. In the event someone does get their hands on one, identification won't be possible. And for those who want to be maximally proactive, there is a browser add-on named Leak Inspector that monitors exfiltration events on any site and warns users when there is a need.

Edge Now Second to Only Chrome as Web Browser of Choice

We can go right ahead and assume that there are plenty of Mac users who opt to use Chrome as their browser rather than the Safari their device came with. We say that with confidence because we're among them, and it is a fact that Google's offering continues to be the de facto choice of web browser for the majority of people all around the world. There are plenty of reasons for that, although at the same time we will be like most people and say that both Safari and Firefox aren't bad per se. Internet Explorer, on the other hand, is an entirely different story.

Now, to be fair, if IE hadn't been left to wither on the vine that might not be the case, but the fact that it was played a part in why the Edge browser has made the inroads into the market that it has. As always, though, choice is a good thing, and if anything it puts pressure on the runners-up to get better and reclaim whatever user share they've lost. So competition joins choice as a good thing. This is one topic that everyone can relate to, and it's been a subject of discussion in nearly every office here in North America and likely elsewhere around the globe.

Like any good Canadian web hosting provider, we're no different here at 4GoodHosting, and you can be sure that those of us around here have the same strong opinions about which web browser is best and why. Likely you have much the same going on around your places of productivity, so this is the topic for our blog entry this week.

Closed the Gap

February of this year had Microsoft Edge on the cusp of catching Safari, with less than half a percentage point separating the two browsers in popularity among desktop users. Estimates now are that Edge is used on 10.07% of desktop computers worldwide, which puts it 0.46% ahead of Safari, which has dipped down to 9.61%.

Google Chrome is still far and away the top dog, though, being the browser of choice for 66.58% of all desktop users. Mozilla's Firefox isn't doing nearly as well as either of them, currently with just 7.87% of the share. That's quite a drop from the 9.18% share it had just a few months ago.

Edge's lead over other browsers, however, needs to be qualified by location. Looking at just the US, Edge trails Safari by quite a bit, with only 12.55% of market share compared to Safari's 17.1%. In contrast, Edge long ago passed Safari on the other side of the pond, with 11.73% and 9.36% shares, respectively, in Europe.

And for Firefox it's not looking promising at all, despite it being what we consider a very functional browser that doesn't really come up short against the others if you look at it strictly from the performance angle. But it doesn't have the marketing clout of either Microsoft or Google, and that means brand recognition won't be the same.

Long Ago in January 2021

As the default Windows 11 browser, Edge has gone up quite a bit in popularity. We talked about February of this year, but let's go back a year and a month further, to the start of 2021. There were concrete signs then that Edge would pass Safari for 2nd place in user popularity, and at that time the estimate was that it was being used on 9.54% of desktops globally. Back in January 2021 Safari was in possession of a 10.38% market share, so what we are seeing is a gradual decline in its popularity over the last year-plus.

Chrome continues to move forward with speed, though, even if it's not 'pulling away' at all. It has seen its user base increase ever so slightly over that time, while Firefox has been losing users since the beginning of the year. And that is true even though Firefox hasn't been standing still at all, making regular updates and improvements to its browser.

So perhaps Apple and Safari can take some consolation in the fact that they're holding onto third place quite well, but the reality is they have lost 0.23% of market share since February. We should keep in mind, however, that Apple has hinted it may be making sweeping changes to the way Safari functions in macOS 13 towards the end of 2022.

Different for Mobile

It's a different story on mobile platforms, and that can be directly attributed to Microsoft's lack of a mobile operating system since Windows Mobile was abandoned. In this same market analysis Edge doesn't even crack the top 6 browsers for mobile, while Chrome has 62.87% of usage share and Safari on iPhones and iPads comes in at 25.35% for a comfortable second place. Samsung Internet comes 3rd with 4.9%.

Overall statistics for desktop and mobile combined: Chrome 64.36%, Safari 19.13%, Edge 4.07%, Firefox 3.41%, Samsung Internet 2.84%, and Opera 2.07%.

It is true that Safari for desktop has received complaints from users recently over bugs, user experience, and matters related to website compatibility. Apple's Safari team responded by asking for feedback on improvements, and to be fair it did lead to a radical redesign of the browser. Many of those changes were rolled back before the final version was publicly released in September.

New ‘Declaration of the Future of the Internet’ Signed Onto by More than 60 Countries

Go back some 30 years, and those of us who were anywhere past adolescence by that time would be dumbfounded to learn just how life-changing this new 'Internet' thing would become, along with being impressed by dial-up modems in a way that would seem bizarre nowadays considering where that technology has gone in such a short time. As with anything there have been growing pains with the Internet too, and like any influential, game-changing technology it has been used for ill in the same way it has provided good for society.

It has also become an integral internationally shared resource, and that goes beyond just the sphere of business. The interconnectivity of the modern world increasingly depends on the submarine cables laid across entire ocean floors so that the globe can be connected by the Internet, and here at 4GoodHosting we are like any good Canadian web hosting provider in that this is near and dear to our hearts given the nature of what we do for people and the service we provide.

This is in part connected to the need to safeguard the future of the Internet, as there are so many complexities to it that didn't exist previously, and no doubt there will be more of them in the future. That is why the recently signed Declaration of the Future of the Internet is such a big deal and more than worthy of being the subject of this week's blog entry.

Protecting Democracy & More

One of the ways the Internet has most notably been abused is to threaten democratic institutions, undermining things like the legitimacy of election results, and there's no doubt there are anti-North American interest groups in other parts of the world using the Web as a means of infiltrating and being subversive within democratic institutions. The belief is that if no efforts are made to nip this in the bud or counter it now, it may become too big to rein in later.

This is why there was such a push to get countries on board for this declaration now, and it seems there was enough enthusiasm and resolve to see it through. The Declaration of the Future of the Internet is meant to strengthen democracy online, as the countries that have agreed to its terms have promised they will not undermine elections by running online misinformation campaigns or illegally spying on people. At least this is according to the White House.

More specifically, the declaration commits to promoting the safe and equitable use of the internet, with signatory countries agreeing to refrain from imposing government-led shutdowns and committing to providing both affordable and reliable internet services for their populations. The declaration isn't legally binding, but countries that sign on have been told that if they back out they will get a very disapproving finger wag from Sleepy Joe at the very least.

Bigger Picture Aim

What this declaration more accurately aims to do is have the principles set forth within it serve as a reference for public policy makers, businesses, citizens, and civil society organizations. The White House put out a fact sheet providing further insight into how the US and other partners will collaborate to safeguard the future of the internet, saying they will work together to promote this vision and its principles globally, with respect for each other's regulatory autonomy within their own jurisdictions and in accordance with their respective domestic laws and international legal obligations.

60 and Counting

So far 60 countries have committed to the declaration, and there is the possibility of more doing so in the next little while. Russia, China, and India were the notable absentees, and while India is a bit of a surprise, the other two are not, considering the reasons they might have for interfering in democratic processes and how effective the web is as a means of making that happen. Google is among the US-based tech giants endorsing the declaration, and its assertion is that the private sector must also play an important role in furthering internet standards.

What is likely is that something similar will be required every couple of decades or so moving forward, particularly if the web is to make even deeper inroads into everyday life. It has certainly shown it has the potential for that, and that potential is likely growing all the time.