Do’s and Don’ts for Hosted Exchange Migrations

Trends are trends, and the reason there’s often no stopping trends is that there’s usually a darn good reason everyone’s doing whatever it is. These days one such trend with solid legitimacy behind it is moving from an on-premises Microsoft Exchange deployment to hosted Exchange Online, and for most people it is nothing short of a huge undertaking. It’s often full of major issues along with considerations and decisions galore, and a lot of people won’t know what they’ve gotten into with moving to hosted Exchange until they’re well into the process.

But you’re going to do what you’re going to do, and especially if it’s something you feel you need to do. I remember when I was very young and my grandfather said to me ‘some birds do, and some birds don’t. Some birds will, and some birds won’t.’ I had absolutely no idea what on earth he was talking about but I stared up into the sky anyways. The few birds I saw were flying around being birds like any other and I remember thinking what is it they would or wouldn’t be doing in the first place.

But enough about that. Our discussion today is not really about trends or about who is going to do what. It’s about getting your organization into Exchange Online, and for some people that’s full of pitfalls that can make the whole thing far too unpleasant, especially if you have no choice but to continue on with it.

So here’s what we know about what you should do, and what you shouldn’t do.

Don’t underestimate the time required for moving the entirety of data over

A whole bunch of factors can make this a lengthy ordeal. How many users do you have? How much data does each mailbox have stored? Do you have bandwidth constraints? The list can go on. Migrating email to the cloud can take anywhere from a few days to several weeks. In fact, Microsoft contributes one major slowdown of their own – a less-obvious protective feature of Exchange Online throttles sustained inbound connections to keep the system from being overwhelmed. A noble aim, but it may have you getting frustrated pretty quickly if you’re hoping to keep your migration moving ahead.

However, once you’re up and running and fully in the cloud you’ll come to appreciate this line of defense, which works to benefit the general subscription base. But while you are trying to ingest data you may see it slowing to a crawl. That’s just the way it is, and there may not be a way around it, so you’ll have to be patient.
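
One practical way to cope with the throttling is to slow down when the service tells you to rather than hammering it. Here is a sketch of that pattern, not Microsoft’s recommended tooling; the `Throttled` exception and the `migrate_batch` callable are hypothetical stand-ins for whatever your migration tool actually reports:

```python
import random
import time

class Throttled(Exception):
    """Stand-in for whatever signal your migration tool gives when throttled."""

def migrate_with_backoff(migrate_batch, batches, max_retries=6, base_delay=1.0):
    """Retry each batch with exponential backoff plus jitter when throttled."""
    for batch in batches:
        for attempt in range(max_retries):
            try:
                migrate_batch(batch)
                break  # this batch made it through
            except Throttled:
                # wait 1x, 2x, 4x... the base delay (with jitter) before retrying
                time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
        else:
            raise RuntimeError(f"batch {batch!r} still throttled after {max_retries} tries")
```

Real migration tools build something like this in, but knowing the shape of it helps explain why a throttled migration takes as long as it does.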

Do use a delta-pass migration

A delta-pass migration, rather than a strict cutover migration, reduces time pressure on you as the migration progresses. With a delta-pass migration, multiple migration passes are made while mail is still being delivered on-premises. For example, the first pass might move everything from Tuesday, Mar 1 backward, and then another pass later in the week moves the “delta” (the changes) from that day through Friday, Mar 4, and so on in succession until mailboxes are up to date.

This is a useful technique, with each successive migration batch being smaller than the last and taking less time. Your users won’t lose historical mailbox data because each pass carries forward everything that came before it.
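
As a sketch of the idea (our own illustration, not any particular migration tool’s API): each pass picks up only the items changed since the previous cutoff.

```python
from datetime import datetime

def delta_passes(items, cutoffs):
    """Split (item_id, last_modified) pairs into successive migration batches.

    The first pass moves everything up to cutoffs[0]; each later pass moves
    only the 'delta' changed since the previous cutoff.
    """
    batches, prev = [], datetime.min
    for cut in cutoffs:
        batches.append([item for item, ts in items if prev < ts <= cut])
        prev = cut
    return batches

mailbox = [("welcome", datetime(2022, 2, 20)), ("invoice", datetime(2022, 3, 3)),
           ("memo", datetime(2022, 3, 4))]
passes = delta_passes(mailbox, [datetime(2022, 3, 1), datetime(2022, 3, 4)])
print(passes)  # the first pass is the big one; the delta pass is smaller
```

Run against a real mailbox, the first batch is the multi-gigabyte slog and everything after it shrinks fast.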

Don’t skip configuring edge devices and intrusion detection systems to recognize & trust Exchange Online

Forgetting to do so, or choosing not to, may mean your migrations are interrupted because your IDS thinks a DoS attack is underway. Fortunately, Microsoft publishes a regularly updated list of the IP addresses used by all Microsoft 365 services, and you can use it to configure your edge devices to trust those traffic flows.
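
Microsoft’s endpoints web service returns JSON records you can filter down to the Exchange Online ranges. A minimal sketch of that filtering step; the sample records below use RFC 5737/3849 documentation addresses, not real Microsoft ranges, and in practice you’d fetch the live list from the endpoints service instead:

```python
import json

# Illustrative sample in the general shape of the endpoints service output;
# the addresses here are documentation ranges, made up for the sketch.
records = json.loads("""[
  {"serviceArea": "Exchange", "category": "Optimize",
   "ips": ["192.0.2.0/24", "2001:db8::/32"], "tcpPorts": "443"},
  {"serviceArea": "SharePoint", "category": "Default",
   "ips": ["198.51.100.0/24"], "tcpPorts": "80,443"}
]""")

def exchange_cidrs(recs):
    """Collect the IP ranges an edge device should trust for Exchange traffic."""
    cidrs = set()
    for rec in recs:
        if rec.get("serviceArea") == "Exchange":
            cidrs.update(rec.get("ips", []))
    return sorted(cidrs)

print(exchange_cidrs(records))
```

Feeding a list like this into your firewall or IDS allowlist is what keeps the migration traffic from looking like an attack.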

Do start by running the Office network health and connectivity tests

Microsoft offers a comprehensive tool that can alert you to routing or latency issues between you and the Microsoft 365 data centers. Speed, routing, latency, jitter, and more – all measured on your network connection to identify and isolate common issues that could degrade the experience for Microsoft 365 users. This is particularly true for voice applications.

Do plan on implementing 2-factor authentication

A primary advantage of moving to Exchange Online and Microsoft 365 is being able to use all of the new security features available in the cloud. Top of the list is the ability to turn on two-factor authentication. It diminishes your attack surface significantly as soon as you turn it on, and since Microsoft has already rewired the directory and Exchange security model on its servers to make it work, all that’s required of you is flipping the switch and showing your users where to enter their mobile phone numbers.

An even better choice is to use the Microsoft Authenticator app, which cuts down on the security and social engineering risks of SMS text messages. Of course, deploying Authenticator across thousands and thousands of phones can be difficult, especially with BYOD setups and remote-work environments where employees don’t have IT support on hand. SMS, by contrast, requires nothing from the end user and is handled entirely by IT. Either way, turning on two-factor authentication really is the right choice.
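
Under the hood, authenticator apps generate time-based one-time passwords (TOTP, RFC 6238): an HMAC computed over the current 30-second interval count. Here is a compact sketch of the algorithm itself, not Microsoft Authenticator’s specific implementation:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the big-endian time-step counter."""
    counter = struct.pack(">Q", timestamp // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59 seconds
print(totp(b"12345678901234567890", 59, digits=8))  # → 94287082
```

The shared secret never crosses the network at login time, which is exactly what makes app-based codes stronger than SMS delivery.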

In a hybrid environment, don’t remove your last Exchange server

Keeping at least one Exchange server running on-premises in order to manage users is a cardinal rule for organizations that have recently made their migration. It is possible to manage recipients by editing Active Directory attributes directly, but it’s not particularly well supported. At least not at this time.

It is preferable to use the Exchange admin console of your on-premises server to manage recipients in a hybrid environment, and without leaving an Exchange server running in your on-premises deployment you can’t do that. Microsoft has said a solution for this should eventually be made available, but even after all this time there’s been little progress toward solving the problem. It really is the only stain on Exchange Online as of this time, and it doesn’t take away from the overall advantages much if at all.

Managed Open Source Increasingly Driving Business Growth

Sharing the wealth is a pretty good rule to go by if you’re able to share it, and there have been plenty of examples where if you don’t you end up with someone like Robin Hood who will share it for you. When it comes to the world of web development there’s never been any doubt about that, and that’s why source code is made available as open source as readily as it is. The widespread adoption has been of immense benefit to anyone who ‘builds’ anything worth mentioning for design and functionality.

Here at 4GoodHosting we’re like any good Canadian web hosting provider in that there’s some of us around here that speak Programmer, but there are others that don’t speak it at all and that’s alright. Some weeks our entries here may be a little more digestible for the less web-savvy of you, but this likely isn’t going to be one of them. If you’re a coder, or if you’re someone who can appreciate what web development is doing for your business’s marketing and promotion capacities, then this is something that will be of interest.

Adopting new business strategies or implementing new technology is a proven effective way to grow and compete more effectively. More and more regularly it’s open source technology being tabbed as companies seek a competitive edge and the latest innovations. A recently published survey found that 85% of enterprises reported using open source in their organization, and in simple numbers adoption of the software has really taken off over the last year. Almost half of these same teams are looking to rely more on open source in response to everything that’s changed (and that they’ve learned) over the course of the COVID pandemic.

The Right Fit Now

You will be challenged to find anything around us that is NOT powered by open source today, from mobile phones to household appliances and more. Being able to build on the existing foundation of technology, and not be hampered in making use of what you can to build your expansion on, is what open source is all about. Open source and permissive licenses give businesses real agility and the ability to move faster, experiment, and innovate to be as competitive as possible in their space.

Open source is transparent and open to inspection, and as a result businesses benefit from the ability to utilize and process their own data independent of the fortunes of any single vendor or single product. Add to that the open development model, with contributions from small and large enterprises plus a few select ‘big players’ like Amazon, and open source stays consistently at the very cutting edge of innovation.

One huge plus is that bugs in the code can be identified, diagnosed, and resolved quickly. Many have said this alone makes open source software more secure than any proprietary software. However it is true that open source can be more difficult to implement than proprietary software as it’s usually not so plug-and-play in the same way. In order to maintain it you will also need to keep on top of patches and updates.

Because open source software is built for the community, it does come with some challenges. The worldwide open source community doesn’t give direct support to individual businesses using the technology. There are forums, online guides, and other places where you can often find the information you need.

Add Management

And here is where managed open source enters the picture. It is an express solution to some of the key challenges of open source software, letting businesses get the best out of it without also taking on responsibility for maintenance. Managed open source providers handle implementation, maintenance, and security, freeing up in-house developers to focus on important work contributing to business growth rather than spending time ‘running things’.

Open Source and Cloud

It’s expected that the global public cloud infrastructure market will expand massively in 2021, with some expectations of around 35% growth and some $120 billion in sales. What’s driving cloud adoption is exactly what’s driving open source adoption – business agility along with the ability to innovate and experiment at a speedier pace.

In the bigger picture, businesses need to find a mix of solutions that fit them and their individual use-cases. For many businesses, that mix will include some combination of open source software and cloud technology. Implementing these technologies with the right support can promote growth, agility, and innovation. Businesses are coming to see how open source can help them, and because this trend will continue, brushing up on open source would make sense if you do speak the language.

Siloscape: Newest Super Malware Arriving on Scene

No one needs to hear how malware has become so much more sophisticated and far-reaching nowadays, as the topic’s been beaten to death and everyone knows that cyber security experts are hard pressed to keep pace. Well, here we go again with one of the more menacing ones to come out of the void in recent years. That’s Siloscape, named that way because this is malware whose primary aim is to escape the container, and what better way than up and out.

To get technical, Siloscape is a heavily obfuscated malware built to open a backdoor into poorly configured Kubernetes clusters and then run malicious containers, along with other sneaky, up-to-no-good activities. If an entire cluster is compromised, the attacker gets access to sensitive information like credentials, confidential files, or even entire databases hosted in the cluster. Experts are semi-jokingly comparing it to the novel coronavirus, as this malware bug is pretty darn novel in itself; there’s really been nothing like it before, and that’s why it’s generating fanfare.

Unlikely to be as calamitous in the big picture as this darn pandemic though, which is a good thing.

All of this stuff tends to be fascinating enough for those of us here like it would be for any Canadian web hosting provider. Nature of the business and all, and while we have a formative understanding of web security practices there’s no one here who’d be able to pull up the drawbridge in any situation like this.

So let’s have a look at this Siloscape malware and lay out what you might need to know if you’re your own cyber security expert.

Cluster Buster

For anyone who might not know, the reason this is as serious as it is is because Kubernetes is one of the most popular open-source applications around, and for good reason. Containers have been wonderful, and that’s why it’s unfortunate Siloscape is engineered to do what it does. So many organizations moving into the cloud are using Kubernetes clusters as their development and testing environments, and software supply chain attacks against them have to be seen as a huge threat.

Compromising an entire cluster is much more of a big deal than just an individual container. Clusters can be running multiple cloud applications and attackers might be able to steal critical information like usernames and passwords, an organization’s confidential and internal files or even entire databases hosted somewhere in that cluster. Then there’s also the possibility of leveraging it as a ransomware attack by taking the organization’s files hostage.

What You Need to Know

Some people don’t like sulfides, even though the foods that contain them tend to be good for your health. Onions are among them, and the reason we’re talking about foods here at all is because Siloscape uses the Tor proxy and an .onion domain to anonymously connect to its command and control (C2) server. Knowledge is power when defending against a foe, and so we’ll share more of what we know about Siloscape’s operation and what you might be able to be on the lookout for.

Siloscape malware is characterized by these behaviors and techniques:

  • Targets common cloud applications (usually web servers) for initial access, using known vulnerabilities (‘1-days’), often ones that already have a working exploit
  • Uses Windows container escape techniques to break out of the container and gain code execution on the underlying node
  • Abuses the node’s credentials to spread through the cluster
  • Connects to its C2 server via the IRC protocol over the Tor network
  • Waits for further commands
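
As a small illustration of that Tor angle (a simplified sweep we sketched for this post, not a real detection product): outbound references to .onion destinations in your proxy or DNS logs are worth a second look.

```python
SUSPECT_MARKERS = (".onion",)

def suspicious_lines(log_lines):
    """Flag log lines mentioning Tor hidden-service destinations."""
    return [line for line in log_lines
            if any(marker in line.lower() for marker in SUSPECT_MARKERS)]

logs = [
    "10:01 CONNECT api.example.com:443 OK",
    "10:02 CONNECT abcdef1234567890.onion:443 OK",  # Siloscape-style C2 traffic
]
print(suspicious_lines(logs))
```

A real environment would feed this from your proxy’s export and pair it with alerting, but even a crude sweep like this surfaces traffic that has no business leaving a Kubernetes node.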

It’s very likely that we’ll hear a lot more about this new malware in the coming weeks and months, and with all the recent news of major data hacks in the USA you have to hope that we don’t hear of it in one of those contexts.

A Fix?

Microsoft doesn’t recommend using Windows Server containers as a security boundary, and recommends Hyper-V containers instead for anything that relies on containerization for security. Processes running in Windows Server containers should be assumed to have the same privileges as admin on the host – the Kubernetes node. If you are running applications that need to be secured in Windows containers, then Hyper-V containers are the safer choice.

Managed Hosting – The Pros and Cons

Managing something usually has the connotation of getting greater productivity out of whatever it is. So if managing is always a plus, does that apply to managed web hosting the same way? The appeal is easy to see, and if you don’t know what managed hosting is, it’s where the web hosting provider manages the system for you. As you’d expect, that means additional cost for the client, but many businesses and ventures will come to see that as money well spent. This is particularly true if time and manpower aren’t resources you have in abundance and you need to focus more on content rather than the workings of the site.

Here at 4GoodHosting we’re like any other quality Canadian web hosting provider in that what we do makes us fairly knowledgeable on matters like this one. While we think managed web hosting is great for some people, we also feel that with a little self-initiative, paired with the ease of use offered by the cPanel that comes standard with our web hosting packages, you can be fairly productive on your own. That likely won’t be true for those operating larger e-commerce websites, but for anyone who’s – for example – selling their pottery online or something similar, you’d be surprised how proactive you can be.

So that’s what we’re going to do with our entry this week: look at what’s good about managed web hosting, and also at what’s not so good about it.

Pros of Managed Hosting

Being successful and/or profitable with a website can take up a lot of time and effort, primarily for content updates, design tweaks, and digital marketing activity. When you add ongoing technical maintenance on top of that, it can be overwhelming for some, especially if you’re the type who’s a webmaster in title while not being particularly web savvy in the first place. Everyone needs to start somewhere though.

The advantage of managed hosting is how it lets you focus on the specifics of your business without additional worries of any sort related to site performance or security. E-commerce merchants will always need speedy loading times but often won’t have the time, ability, or inclination to ensure that on their own. Managed hosting is also called fully-serviced hosting, and for anyone who’s not inclined or capable of doing the mechanic’s work of website performance, it may well be worth what you’ll pay additionally each month.

Either way, it is true that you need to factor in the value of your time. Many of us don’t have much of it that isn’t spoken for when it comes to our livelihoods, so if that’s you and much of that livelihood is facilitated by a website(s) then managed hosting may be the right fit for you.

Cons of Managed Hosting

Oppositely, if you have solid IT skills and see the value in having more immediate, hands-on control over your website, then managed hosting may be something you don’t really need or want. Many people don’t want to relinquish control over the fundamentals, and there is some merit to that; while most web hosting providers run reputable operations, there is always the chance of a bad actor doing harm.

(Yes, we can assure you that would never happen here)

As you might guess, this is in fact the most common reason people go with manual hosting. Let’s also consider a scenario where you want to make an immediate change but there’s a delay in response from your hosting company and a window of opportunity is lost. Another instance might be wanting to use a particular CMS but not having the needed support for it. Others won’t want to risk a situation where there’s no choice but to outsource important tasks due to time constraints or anything similar putting them in a pinch.

Keep in mind two other concerns related to managed hosting. The first is cost; for anyone who thinks changes to their initial setup are very unlikely, it doesn’t make sense to continue paying extra for site management through managed hosting. Plus, the greater control that comes with manual hosting allows you to switch to a new host with minimal hassle if terms or prices change.

The last thing we’ll mention today is one more advantage of good managed hosting: you’ll get a more immediate defensive response if your site is attacked or a server fails. If you have any interest in migrating your website to a better Canadian web hosting provider then we’d be very happy to hear from you.

From VPS to VDS

We’ve been pretty emphatic with promoting virtual private servers here over the years, and nothing has changed with how they’re a much better choice for websites that need more bandwidth and performance resources at their disposal. That’s not going to be the scenario for the vast majority of people having their websites hosted, and every one of those guys and gals will be just fine paying as little as possible for web hosting by going with a shared hosting plan. It’s especially the best choice for anyone whose site is simply a blog.

One of the things that makes us a top Canadian web hosting provider is the way we are able to pivot and turn with industry trends, and that’s what we’ll be doing with one of the more promising ones these days. VPS continues to be a good option, but VDS servers are now the even better choice for a lot of people who made the move to VPS a while ago. VPS stands for virtual private server, while VDS stands for virtual dedicated server.

So what’s the difference and what makes VDS better for some? That’s what we’ll look at with this week’s entry.

Bare Metal Alternative for Bigger Boys

Most people who have their website as a primary resource for e-commerce are going to be operating it on behalf of some business. It’s not surprising that bigger businesses will have bigger websites, ones with plenty of size and dynamic content components compared to what the average website on shared hosting is going to have.

These people needed powerful servers with predictable performance, and for many years that meant going with a bare metal server. But now a VDS server is a better option when dedicated resources are needed. Previously you would probably rent a bare metal server and have it colocated in a data center. The problem is that such a server is not going to be scalable, and bare metal servers have always tended to be expensive to maintain.

So Why VDS?

Change became possible when hypervisor technologies made it so that multiple virtual machines could be run on the same hardware. These completely individual virtual servers would have their own CPU, RAM, disk storage and more, and so before long VPS servers arrived on the scene.

Where VPS and VDS differ starts with whether resources are shared between the virtual machines. With a VDS, users receive a virtual server with guaranteed resources, including 100% of the CPU, which is entirely at your disposal and can be spun up as much as you need. RAM and other resources are likewise dedicated to you.

It is also superior because these new-generation servers deliver the flexibility to upgrade resources instantly at any time. You can manually raise or lower the number of CPU cores, increase or decrease RAM capacity, or add to or take away from disk storage. With a physical server any of that would be one heck of a chore; you’d be buying parts, installing them on your own, and probably creating a whole lot of downtime for the site.

Who Will VDS Fit Best?

Sites that aren’t accommodated by the capabilities of conventional hosting, along with high-load network services, are going to be the best fit for VDS, and that will also be true where webmasters are actively involved in the design, development, and testing of software. Closed corporate projects with increased requirements for security and data confidentiality will also find virtual dedicated servers ideal, and smaller companies that would struggle to buy or lease a physical server will want to look into this alternative too.

Reliability is always a priority, and the dedicated resources of a VDS server makes these more reliable too. Many will offer multiple cores, and being able to tailor RAM is an advantage in this regard too. Last but not least, this type of separate server offers full control over the system. From managing user accounts and installing software to network interfaces configuration and firewall settings.

Supreme Versatility

As-a-Service software is increasingly becoming the norm in the digital work and commerce worlds, and VDS servers join the trend as an example of IaaS (Infrastructure-as-a-Service), arguably the most versatile cloud service consumption model. You are able to build any information system you need, install any operating system and software, or easily configure any network.

Last but not least, a VDS server also outdistances a bare metal server with the way they allow for a quick upgrade or downgrade of computing power. VDS servers also make it easier to host and manage applications flexibly.

Stay tuned for new and appealing web hosting options available to customers here at 4GoodHosting.

‘Chaos’ a Means to Foil Hackers with Digital Fingerprints

The past 3 years or so in the world of cybersecurity have really made clear that hackers have expanded their reach and capability in a big way over that time, and it’s fair to say that cyber security interests have struggled to keep pace in protecting digital interests from being hacked into. The single individual doesn’t have as much to worry about when it comes to being hacked as a business or large enterprise does, but that doesn’t mean that they should be unconcerned.

Here at 4GoodHosting we’re like any other good Canadian web hosting provider in that we make sure our servers are as secure as possible, but we also know that we’ve got it pretty good in comparison to some others who have way, way more in the way of data that needs to be protected. Researchers have been stepping up their efforts to keep hackers in check, and that’s a very good thing.

Specifically, what’s happened recently is they’ve found a way to use chaos to help create digital fingerprints for electronic devices that may be so thoroughly unique that even the most sophisticated hackers can’t get past them. That’s based on the sheer volume of possible combinations, meaning it would take an incomprehensibly long time to go through and try every one of them.

How long? Well, we’ll get to that as we move further into discussing this very interesting development in web security.

That Long!

It’s believed that these chaos fingerprints have so many layers of unique patterning that it would take longer than the lifetime of the universe to test every possible combination. Behind all of this is an emerging technology called physically unclonable functions – PUFs – which are built into computer chips.

We’re not quite there yet, but these new PUFs could possibly be used to create super-secure ID cards, reliably track goods in supply chains, and serve in authentication applications where it is vital to know the individual you’re communicating or sharing information with is legit. The recent SolarWinds hack on the US government has really prompted interest groups to stop being complacent and find much more reliable cyber security methods and approaches.

The key feature with PUFs is that there are tiny manufacturing variations in each computer chip, ones so small the end user won’t notice them. Often the variations are only seen at the atomic level, and in industry lingo they’re starting to become known as ‘secrets’.

More Secrets Required

The shortcoming with current PUFs is that they only contain a limited number of secrets. Anywhere between a thousand and a million isn’t enough to stop a hacker who’s got the will to persevere and eventually find their way in. One with the right technology and enough time can figure out all the secrets on the chip.

But now it’s believed that chaos makes it possible to have an uncountably large number of secrets layered on top of each other, with so many of them and such unique detail between them that it’s going to be way too much of a challenge for even the most capable hackers.

What’s Chaos Then?

Right, before we go any further we need to briefly define what ‘chaos’ is when it comes to semiconductor chips. A basic definition: the output of a semiconductor laser and its parameters are tweaked – often by modulating the electric current pumping the laser or by feeding back some of the laser’s light from an external mirror – to make the overall laser output chaotic and unpredictable.

Unpredictable is the key word there; there’s nothing in the way of patterns or logic to be determined, and so hackers’ standard approaches aren’t effective anymore.

The recent noteworthy developments with all of this have been researchers creating a complex network in their PUFs using a web of randomly interconnected logic gates. By taking two electric signals and using them to create a new signal, a repeating variance pattern is established and the layers become increasingly unique and undecipherable.

This then amplifies the small manufacturing variations found on the chip. Every slight difference amplified by chaos generates an entirely new set of possible outcomes, and the layers then come in waves, with each new one making it even more difficult for a hacker to operate successfully.
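
To see why that amplification matters, here’s a toy simulation we put together (purely our own sketch, nothing like a real PUF circuit): a web of randomly wired XOR gates, where flipping a single input bit quickly produces a wildly different output.

```python
import random

def make_network(n, seed):
    """Randomly wire each of n gates to two source nodes (the chip's 'secret')."""
    rng = random.Random(seed)
    return [(rng.randrange(n), rng.randrange(n)) for _ in range(n)]

def step(state, wiring):
    """Each gate combines two signals into a new one (here, XOR)."""
    return [state[a] ^ state[b] for a, b in wiring]

def avalanche(seed, n=64, rounds=15):
    """Hamming distance between responses to inputs differing in one bit."""
    wiring = make_network(n, seed)
    rng = random.Random(seed + 1)
    s1 = [rng.randrange(2) for _ in range(n)]
    s2 = list(s1)
    s2[0] ^= 1  # a single atomic-scale variation, so to speak
    for _ in range(rounds):
        s1, s2 = step(s1, wiring), step(s2, wiring)
    return sum(a != b for a, b in zip(s1, s2))

print([avalanche(seed) for seed in range(3)])
```

In most runs the one-bit difference spreads to dozens of output bits within a few rounds, which is the avalanche behavior the researchers are exploiting.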

Just Right

One thing that’s important with this new advance is knowing just the right amount of chaos to implement. It’s important to have chaos running for the right length of time: too little and you won’t get the security level you’re after; too long and things become way too chaotic.

It’s estimated that these new PUFs can create an enormous number of secrets per chip: so many that even if a hacker could crack one secret every microsecond, it would still take about 20 billion years to get through a single microchip.
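
The arithmetic behind that figure is worth seeing: at one guess per microsecond, 20 billion years works out to roughly 10^23 guesses, which gives a sense of the scale of secrets a single chip would have to hold.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600     # ~3.16e7 seconds in a year
guesses = 20e9 * SECONDS_PER_YEAR * 1e6   # years * seconds/year * guesses/second
print(f"{guesses:.2e} guesses")           # on the order of 1e23
```

For comparison, that dwarfs the thousand-to-a-million secrets of current PUFs by many orders of magnitude.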

The protection is far-reaching too. Machine learning attacks, including deep learning-based methods and model-based attacks, all failed when put to the test against these new digital fingerprints. The hope is that PUFs like this could massively enhance security against even the most sophisticated hacker attacks backed by serious computing resources.

It will be interesting to see how pervasive this cybersecurity technology becomes, but it’s easy to see how it would immediately appeal to Federal Governments in particular. And we can probably safely assume with widespread adoption it will be frustrating more than a few residents in the world’s geographically largest country.

Microsoft and Intel Team Up to Make it Hard for Crypto Miner Crooks

It’s been a while since we had an entry on the much-buzzed-about topic of cryptocurrency, and it’s not like we’ll never turn a new fold on the page with the stuff, given how much uncertainty there still is about whether cryptocurrencies will ever assume the central role in globally unregulated currency that some people still adamantly insist they will. How much actual crypto mining is going on out there, and the lengths people are going to in order to get in on the action, suggest there’s still plenty of belief out there.

Whether or not cryptocurrencies will be a legit option for paying for SaaS or PaaS products is something that might be of interest to a Canadian web hosting provider, and that definitely applies to us here at 4GoodHosting. Some say the real question is whether Bitcoin or any cryptocurrency can ever be generated in the kind of volume that would be needed to make it a legitimately exchanged currency. But as we said, the effort is definitely there to ‘obtain’ whatever there is to get out there, and the criminal element in crypto mining underlines that even more.

A tool is always needed, and for these crypto miner crooks the implement of choice is crypto-jacking malware. It’s like hijacking, but it’s not planes being intercepted; it’s cryptocurrency.

But the good news is that tech giants are fighting back and making it much more difficult to hijack someone else’s cryptocurrency when they’ve been mining it legitimately.

Super Defender

To get right down to it, what Microsoft is doing is integrating Intel Threat Detection Technology into Microsoft Defender for Endpoint, and this revamped security product will help protect businesses from crypto-jacking malware. Up until now these crypto miners have used only a small fraction of a device’s power, so they often don’t end up on the radar of security teams.

It’s only more recently, with larger sums being lost to crypto jackers who’ve found ways to operate more effectively and with greater reach, that this has become more of a priority for everyone, even though crypto mining can be difficult to detect. Much of that difficulty is because the symptoms look like slow or sluggish machines with bloated software, or get masked by inferior threat detection and automated upgrades running on them.

But again, this has changed: the rise of crypto jacking and the bite it has taken out of people and organizations means decision-makers aren’t ignoring it any more. Add the fact that failing to thwart crypto jackers means the cryptocurrency mined at these organizations goes on to fund criminal gangs or whoever else wants ill-gained funds for whatever it is they’re aiming to do.

Better Performance

What these two have done extremely well is execute security tasks while keeping it all in-house within a hardware module. There are major performance advantages to this, especially with an identification process based on resource utilization that is MUCH faster than it would be with software-based approaches.

There is also no need to deploy software that might be filled with bugs or come with vulnerabilities of its own. Intel has added a very valuable component at the CPU layer, making it more difficult for crypto jackers to hide their activities – software-only solutions would be much more likely to lose the scent, if you know what we mean.

Working at this level can identify abnormal behaviour that the malware might otherwise pass off as normal activity.

Catching Coin Thieves at the CPU

Intel TDT applies machine learning to low-level hardware telemetry sourced directly from the CPU performance monitoring unit (PMU). This shines a brighter searchlight on the system, making it far more likely that the malware’s code execution and its ‘fingerprint’ are identified at runtime – which is when it’s most on display and ready to be caught by Defender.

Typical obfuscation techniques will make no difference here, and that remains true even when malware hides within virtualized guests – all without resorting to intrusive techniques like code injection or complex hypervisor introspection.

In addition, some of the machine learning is offloaded to Intel’s integrated graphics processing unit (GPU). And because coin miners make heavy use of repeated mathematical operations, a signal is triggered when the PMU records that activity passing a certain usage threshold.

Taken together, these machine learning capabilities mean the footprint generated by coin mining activity can be identified and recognized, and Defender is unaffected by common antimalware evasion techniques such as binary obfuscation or memory-only payloads.

No-Agent Malware Detection

This TDT-integrated solution can also expose coin miners using unprotected virtual machines or other containers as hiding spots. By stopping the virtual machine itself or reporting virtual machine abuse, attacks are prevented AND resources are saved.

This no-agent malware detection means the asset can be protected from the attacker without the defence having to run in the same OS.

All of these advances are important, because criminal crypto miners are getting better all the time, so the security measures that limit their effectiveness need to improve in step too. One thing that is for sure: as cryptocurrency values continue to rise, crypto-jacking becomes much more attractive to a whole lot of people.

Pandemic Making Mobile Even More Prominent

It’s almost a certainty that five to ten years from now there will be all sorts of conclusions drawn about how the COVID years of ’20 and ’21 led to widespread fundamental changes in the lives of humans. Some of that will be good, some of it not so good. Some changes are already very apparent, like the explosion in popularity of online shopping. Others can’t be seen as easily, and it takes data analysts and the like to dig up those realities and share them with the rest of us.

Which is fine. And leads us to the question – have you noticed your mobile device in your hands and ‘at work’ a whole lot more often since around about a year ago? For some people that may have been the reality long before early 2020, but as a collective whole it would seem that COVID-19 has somewhat subtly made our smartphones a whole lot more integral to our day to day. Here at 4GoodHosting, we imagine we’re like any good Canadian web hosting provider in that we’ve seen more interest in people having mobile-friendly websites.

So what’s behind all of this, and how is it that a global pandemic is making mobile even more prominent in the digital lives of earth’s inhabitants?

App usage, purchasing, and time spent on apps have all gone up considerably as the world relied even more on mobile web access to weather the storm. In response, more and more venture capital has gone into mobile web technology development, and the industries that saw the most of it were financial services, transportation, commerce, and shopping. That’s led to what’s being seen now – 26% of total global VC funding is going into mobile-related ventures, and $73 billion was directed into mobile just last year.

Attention and Engagement

There are a lot of different factors leading people to approach more and more tasks with a handheld device rather than a notebook or desktop. Mobile is increasingly the hub, both at home and at work, and this new reality is driving considerable changes in how we spend our time. Look no further than recent study results that found our US neighbours now spend 8% more time on mobile devices than they do watching TV.

All around the world there are countries where people spend an average of more than 4 hours a day on their mobile devices. There’s always been an attention economy, and mobile is now very much at the centre of it. Again looking to the USA, it’s estimated that people spend anywhere from 16 to 30% of their time using apps on mobile devices, and that goes for everyone from millennials all the way to Gen X and even Baby Boomers too.

Add further that consumers spent $143 billion on apps – or in them – in 2020. That’s up 20% from the year previous.

Changing Enterprise is Front & Centre

Not surprisingly, Apple and iOS 14 are right on the cutting edge of the new mobile economy. iOS 14 quickly jumped out to higher adoption rates than previous OS generations, and that led to mobile ad placements growing 95% across the year. Add to that the new work-from-home reality that many people now have, and all the B2C and B2B communications that come along with it. A similar survey found that business apps grew 275% year over year in the 4th quarter of 2020.

What this has created is a need for more consistent data usage and speed, and the best of that is of course coming with 5G. Or so we’ve been told. Then there are also more and more people using mobile devices for inter-office communication when they’re away from their home workstation, often on apps like Microsoft Teams or Slack.

Communication apps like Twitch and Discord, community-focused apps including Nextdoor, and payment apps are also expanding in leaps and bounds in the space for the same reasons. The interest in better privacy and security for these apps is also pushing investment into mobile.

Industry Transformation

The shift doesn’t end there. Finance, investment, education, and keep-fit apps saw a noticeable use uptick too, and eBook revenues climbed 30%. Digital wallets are increasingly popular for people increasingly comfortable with mobile-based finances, and more and more people are investing through their smartphone too. Stock market activity on mobile climbed 55% globally with Robinhood, Cash App, TD Ameritrade, Yahoo Finance, and Webull Stocks coming in as the top-5 investment apps.

Using iOS within the enterprise is also up quite a bit, and notable here is HSBC embracing Apple’s ecosystem as a means of delivering cutting-edge, customer-facing mobile services.

Last but not least – how many have found you’re ordering in a LOT more than you used to? Food delivery services also saw use increases of more than 100% year-on-year across most markets.

Skipping the dishes fairly regularly these days? Join the club it seems, and it’s for all the reasons that mobile is increasingly taking the lead given new world realities and expanding user preferences.

8 Ways to Improve cPanel User Security on Your Own: 2 of 2

As promised, with this week’s blog entry we’re continuing with the discussion of what the average person can easily and reliably do to improve the security of their website when managing it through cPanel, which is of course what you’ll be doing if your site is hosted by us. Widespread adoption speaks volumes about the quality and practicality of software, and with that understood it’s not surprising to see that you’d have the same cPanel to work with if you were with pretty much every other good Canadian web hosting provider too.

It continues to be a wild and crazy time in the world as the global pandemic keeps reinventing itself, but fortunately the viruses that infect digitally aren’t nearly as transmissible as the physiological one that’s turning the whole world upside down right now. So if you’re an Average Joe webmaster making sure your website serves whatever purpose you have in mind for it, maybe that’s some good news mixed into a whole lot of not-so-good nowadays.

Anyways, let’s get to continuing our list shall we, and here’s to hoping you’re all keeping safe.

Set up Mod Security

Experts are always eager to share warnings about web application security. Attackers usually aim to reach the web application server and take control of it. It’s for this reason that a Web Application Firewall (WAF) can enhance cPanel security.

Mod Security gets the nod here as the best choice – it’s reputable and open source. It defends reliably against most external attacks – SQL injection, iFrame attacks, botnet attacks, and HTTP Denial of Service (DoS) attacks – and handles webshell/backdoor detection too.
Configuration is simple – you’ll find it under the Security Center of the EasyApache configuration, and there you’ll also have the option to set additional measures to improve the security of your cPanel server.
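If you prefer to work at the file level, the WHM interface is essentially writing standard ModSecurity directives for you. As a rough sketch – the log location here is an assumption that varies by EasyApache build – the core switches look like this:

```apacheconf
# Core ModSecurity switches as cPanel might write them (paths are assumptions)
SecRuleEngine On                         # actively block requests matching rules
SecRequestBodyAccess On                  # inspect POST bodies, not just headers
SecAuditLog /var/log/modsec_audit.log    # where blocked requests get logged
```

Setting `SecRuleEngine` to `DetectionOnly` instead of `On` lets you trial a new rule set and watch the audit log before enforcing it.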

Scan with RootKit Hunter

RootKit Hunter gets extremely high marks from cPanel experts. It’s among the best UNIX-based tools for scanning for possible local exploits, including rootkits and backdoors. It offers the option of manual or automatic scans – and really, why wouldn’t you choose the automatic option? It’s also really straightforward to install.
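For reference, a typical RootKit Hunter session from a root shell looks something like the sketch below. The package manager is an assumption – yum-based here, while Debian/Ubuntu servers would use apt-get instead:

```shell
yum install -y rkhunter      # Debian/Ubuntu: apt-get install rkhunter
rkhunter --update            # refresh the known-threat signature database
rkhunter --propupd           # snapshot current file properties as a clean baseline
rkhunter --check --sk        # run the full scan; --sk skips the keypress prompts
```

Running `--propupd` right after a clean install matters, since later scans compare file properties against that baseline to spot tampering.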

Scan System with Maldet

Maldet detects malware on your server, and as it’s a native Linux product you can trust it to fit in seamlessly here. It takes primary aim at malicious files planted by PHP backdoors and dark mailers.
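A minimal Maldet session might look like the following sketch – the flags are Maldet’s standard ones, but the scan path is an assumption based on a typical cPanel layout:

```shell
maldet -u         # update the malware signature set first
maldet -a /home   # scan all user home directories
maldet -e list    # list previous scan reports for review
```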

Set Up a Cron Job for Running ClamAV

We gave props to ClamAV last week, and running it via cron is the best choice. A cron job is a utility that runs a task automatically on a recurring schedule. If you’re the type of person who prefers to ‘set it and forget it’, then installing ClamAV in conjunction with a cron job for regular virus and malware scans is a good choice.
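As an illustration, a root crontab entry for a nightly scan might look like this – the scan path, schedule, and log location are all assumptions you’d adjust for your own server:

```shell
# Run at 2:30 AM daily: recursive scan of /home, printing only infected files
30 2 * * * /usr/bin/clamscan -ri /home --log=/var/log/clamav/nightly.log
```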

Disable Apache Header Information

For whatever reason, Apache header information tends to be more at risk of malicious attacks than other components in the makeup of a site. Hiding that information from public access is the quickest and most direct fix here, and here’s how to do it:
• Log in to your WHM dashboard
• Go to Service Configuration
• Go to Apache Configuration
• Click on Global Configuration
• Set ‘Server Signature’ to Off and ‘Server Tokens’ to ‘Product Only’
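Behind the scenes, those WHM settings translate into two standard Apache directives, roughly like this:

```apacheconf
ServerSignature Off    # no version footer on Apache-generated error pages
ServerTokens Prod      # send "Server: Apache" only – no version or OS details
```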

Hide PHP Version Information

Hiding PHP version information is another quick and easy move you can make, and it’s effective because it prevents would-be attackers from making quick decisions about which entryway is going to be best for them. It’s equally easy. Here’s how:
• Log in to WHM
• Go to PHP Configuration Editor under Service Configuration
• Disable the ‘expose_php’ option
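The same change made directly in php.ini looks like this (the file’s location varies by PHP build):

```ini
; Stop PHP from adding its version to the X-Powered-By response header
expose_php = Off
```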

Disable FTP and Go with SFTP Instead

FTP is the well-established standard, but these days using SFTP is increasingly recommended. The primary reason is that FTP does not use encryption – all data is uploaded as plain text, which helps attackers get at important information, even including login credentials. SFTP is the better choice, as it encrypts all types of data.

SFTP is known as SSH File Transfer Protocol because it encrypts data in transit. This is what you’ll do:
• Log in to your WHM or cPanel as the admin or root user
• Go to FTP Server Configuration
• Set ‘TLS Encryption Support’ to ‘Required (Command)’
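Day to day, the switch is painless – SFTP sessions feel much like FTP ones, just encrypted end to end. A quick illustrative session (hostname and username are placeholders):

```shell
sftp cpaneluser@example.com
# once connected, the familiar commands still work:
#   put local-file.html public_html/     (upload)
#   get public_html/backup.zip           (download)
```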

Securing cPanel and WHM Access

Adding SSL when logging into cPanel or WHM provides a valuable additional safeguard. By forcing HTTPS in the URL used to access cPanel or WHM, the connection becomes much more secure.

• Log in to your WHM admin panel
• Go to Home, then Server Configuration, then into Settings
• Go to the Redirection tab and enable SSL redirection for your server

Edge Computing & 5G: Perfect Pairing for Apps

5G isn’t commonplace yet, but it’s not too far off from becoming that, and we’re already starting to get sample tastes of what it’s capable of. You likely know someone who already has a 5G smartphone, and they’ve probably been all too keen to tell you how much superior it is. No doubt it is, but the real big story around 5G is what it’s poised to make specific technologies – apps among them – capable of. Then, when the superior processing capabilities of edge computing are added to the mix, we really start ‘cooking with gas’, as the expression goes.

A lot of people are probably currently just as satisfied as can be with apps as it is. Unless you’re a really digitally discerning person you are probably quite fine with what your apps are able to do for you now. If we could fast forward 5 years, however, you’d probably be looking back at your current apps and their capabilities and describing them as totally underwhelming. Here at 4GoodHosting we’re like any good Canadian web hosting service provider in that we can relate to all of this a little more readily than most, and that’s only because the nature of our work puts all of this a lot more on the radar.

So if Edge Computing and 5G are poised to be the perfect dynamic duo that’s going to power web applications to new heights, it makes sense that we dig a lot deeper into this interesting topic.

The Basics

5G promises the ultra-low latency and high capacity required to make data-intensive applications work more effectively. That equation starts with blazing fast speeds – 5G lets data travel from a device to a cell tower and back in just 3 milliseconds, where 4G needs 12-15 milliseconds to do the same. This is huge for advanced applications. Autonomous vehicles (self-driving cars) are one of the best examples of how 5G’s speed and low latency are opening us up to a whole new world of life- and society-improving technologies. Medical professionals using 5G to provide instant diagnoses of patients is another potentially huge development people are excited about.
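To put those figures in perspective, here’s a rough back-of-the-envelope sketch using the 3 ms and 15 ms round-trip numbers quoted above (real-world latencies vary widely):

```python
# How many device-to-tower round trips fit into one second at a given latency?
def round_trips_per_second(latency_ms: float) -> float:
    return 1000.0 / latency_ms

five_g = round_trips_per_second(3)    # ~333 round trips per second
four_g = round_trips_per_second(15)   # ~66 round trips per second
print(f"5G fits roughly {five_g / four_g:.0f}x more round trips per second")  # 5x
```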

Current Network Shortcomings

All of the potential of 5G is only going to go as far as the networks that support it, though. As has been the case with all the cutting-edge technologies that arrived before it, 5G’s arrival into the mainstream has been staggered. Network coverage for it isn’t widespread or common yet, and businesses and developers adopting it are still going through the necessary experimentation with it.

What’s holding that back is data. 5G-supported applications and services produce a vast amount of data, and as 5G adoption becomes more widespread the data demands are growing with it. The issue is that the pipes that transmit that data to and from the cloud and physical data centers don’t have the size or capacity to handle it all and keep it moving at sufficiently fast speeds.

After all, part of what 5G hangs its hat on is ultra-low latency, and that has everything to do with what’s called the ‘last hop’ – the term given to the transfer from the cell tower to the endpoint device itself. Data being sent from the cell tower to a central cloud data center can still take up to 500 milliseconds, and that’s only the halfway point before it has to come back.

Long and short, as of now the 5G experience can still be far too slow for many people’s liking. But of course that is going to change, and the only question is how soon it will get up to a level of speed that consumers deem acceptable.

Adding the Edge

Definitions of edge computing vary, but a safe one is that it’s a distributed cloud architecture made up of local micro data centers. Instead of structuring networks around a core that data is continually sent to for processing and analysis, edge networks take data and process it within micro data centers at the ‘edge’ of the network.

What this results in is much less need to send data back and forth to a centralized server or cloud, working out to less bandwidth usage and much lower latency. In this way it’s key to enhancing the speed and response of the ‘last hop’ we talked about above, and it’s why edge computing is very integral to 5G’s processing power being experienced the way it’s intended to be.

Compared to 5G, edge computing is also already well established. Look no further than well-known initiatives like SyncThink, Molo17 and Doddle. SyncThink in particular is getting a lot of hype these days, with its ability to allow medical professionals to carry out near-instant and entirely accurate injury assessments, even in challenging diagnostic environments.

Good One Way, Good the Other

Edge computing is already fully implemented in the mainstream, and that gives reason to believe that 5G’s widespread adoption is not far behind. And not just because the two enable each other so well. Edge computing enables more organizations to begin experimenting with 5G, and 5G then makes it possible for organizations to get more out of their existing edge deployments.

What this will likely mean for laypeople like you and I is many new classes of applications with unparalleled resilience, speed, security, and efficiency. That has to sound mighty good, no matter where the bulk of your interest lies when it comes to taking advantage of digital connectivity to improve your life.