Need for Attack Surface Management for Businesses Online

Reading Time: 4 minutes

Look back across history and you’ll see there have been plenty of empires, but even the longest-lasting of them eventually came to an end. When we talk about larger businesses operating online and taking advantage of new web-based business technologies, no one is going to compare any of them to empires, perhaps with the exception of Google. But to continue on that tangent briefly, there is no better example of an empire failing because it ended up spread too thin than the Mongol Empire.

The reason we mention it as our segue to this week’s blog topic is that as businesses expand in the digital space, they naturally take on more of a surface, or what you might call the ‘expanse’ of their business in cyberspace, to whatever extent they’ve wanted or needed to move it there. With all that expansion comes greater risk of cyber-attacks, and that leads us right into discussing attack surface management. So what is that exactly? Let us explain.

An attack surface is every asset an organization has facing the Internet that could be exploited as an entry point in a cyber-attack. That could be anything from websites, subdomains, hardware, and applications to cloud resources or IP addresses. Social media accounts or even vendor infrastructure can also be part of the ‘vulnerabilities’ depending on the size of your surface.

All of this is of interest to us here at 4GoodHosting as quality Canadian web hosting providers, given that web hosting is very much part of the foundation of these businesses’ online presence. So let’s dig further into this topic as it relates to cloud security for businesses.

Rapid Expansions

We only touched on what an attack surface can include above. Attack surfaces are rapidly expanding and can now include any IT asset connected to the internet, so we can add IoT devices, Kubernetes clusters, and cloud platforms to the list of potential spots where threat actors could infiltrate and initiate an attack. External network vulnerabilities that create an environment ripe for a potential breach are an issue too.

It’s for these reasons that attack surface management has become a bit of a buzzword in cyber security circles, and those tasked with keeping businesses’ digital assets secure have likely already become very familiar with it. The key is first identifying all external assets with the aim of discovering vulnerabilities or exposures before threat actors do. Vulnerabilities are also prioritized based on risk so that remediation efforts can focus on the most critical exposures.

Logically then, attack surface management needs to be based on continuous, ongoing reviews of potential vulnerabilities as new, more sophisticated threats emerge and attack surfaces expand. It’s interesting that the term was being bandied about as early as 2014, but it is only recent developments and trends that have pushed it to the forefront of cyber security.
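
To make that a bit more concrete, here is a minimal sketch (in Python, with a made-up asset list and a crude scoring scheme) of the kind of check an attack surface management tool runs continuously: enumerate known internet-facing assets, probe them for exposed services, and rank the findings by risk so remediation starts with the worst exposures. Real ASM platforms go far beyond this, but the discover-assess-prioritize loop is the same.

```python
import socket
from datetime import datetime

# Hypothetical inventory of internet-facing assets (illustrative only)
ASSETS = ["www.example.com", "api.example.com", "legacy.example.com"]

# Ports that commonly indicate an exposed service worth reviewing,
# alongside the ones you would expect a web property to have open
RISKY_PORTS = {21: "FTP", 23: "Telnet", 445: "SMB", 3389: "RDP"}
EXPECTED_PORTS = {80: "HTTP", 443: "HTTPS"}

def scan(host: str) -> list[tuple[str, int, str, int]]:
    """Probe a host for open ports and assign a rough risk score."""
    findings = []
    for port, service in {**EXPECTED_PORTS, **RISKY_PORTS}.items():
        try:
            with socket.create_connection((host, port), timeout=2):
                score = 8 if port in RISKY_PORTS else 2
                findings.append((host, port, service, score))
        except OSError:
            pass  # closed or filtered port, nothing to report
    return findings

if __name__ == "__main__":
    report = [finding for host in ASSETS for finding in scan(host)]
    # Remediation effort goes to the highest-risk exposures first
    for host, port, service, score in sorted(report, key=lambda f: -f[3]):
        print(f"{datetime.now():%Y-%m-%d} {host}:{port} ({service}) risk={score}")
```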

6 Primary Trends

Here are the trends in business nowadays that are increasing the risk posed by expanded attack surfaces.

  1. Hybrid Work – Facilitating remote work inherently creates an environment where companies are more dependent on technology and less constrained by location. But the benefits are accompanied by an expanded attack surface and the potential for increased exposures.
  2. Cloud Computing – The speed and enthusiasm with which businesses have adopted cloud computing has also spread out the attack surface at a speed that cyber security platforms haven’t been able to keep up with. This frequently results in technical debt or insecure configurations.
  3. Shadow IT – It is quite common now for employees to use their own devices and services to work with company data as needed, and how ‘shadow IT’ expands attack surface risks is fairly self-explanatory.
  4. Connected Devices – Internet-connected devices have exploded in numbers over recent years, and their implementation in business environments has created a new, high-risk variant of attack surface, one that’s directly connected to the insecurity of many IoT devices.
  5. Digital Transformation – The way companies are digitizing as broadly, deeply, and quickly as possible to stay competitive means they’re creating new attack surface layers while altering the layers that already exist.
  6. Development Expectations – Always launching new features and products is an expectation for many businesses, and this has factored into how quickly technologies will go to market. There is pressure to meet these demands, and that pressure may lead to new lines of code being hastily written. Again, fairly self-explanatory with relation to growing attack surfaces.

The attack surface has become significantly more widespread and more difficult to keep contained as organizations grow their IT infrastructure. This growth often occurs despite resource shortages, and at an unideal time, with a record-breaking 146 billion cyber threats reported for 2022 and likely much the same when this year is tallied up.

It’s for all these reasons that attack surface management is even more of a priority for organizations as they take on key challenges on the frontline of cybersecurity.

New Optical Data Transmission World Record of 1.8 Petabit per Second Established

Reading Time: 3 minutes

Speed has been and always will be the name of the game when it comes to data transmission as part of utilizing web-based technologies. That is true for the smallest of them in the same way it is for the biggest, and it’s not going to come as a surprise that with the advances in those technologies comes a need to handle much more data, and handle the increased volume of it faster at the same time. Add the fact that global network use continues to grow explosively all the time and there’s a lot of gain – both functional and financial – to be had for those who can introduce means of moving data faster.

And that’s just what’s happened courtesy of Danish and Swedish researchers who have succeeded in setting a new benchmark speed for optical data transmission. Many of you will be aware of what a Tbps (Terabits-per-second) speed would indicate in this context, but if you’ve already heard of a Petabit in the same one then consider us impressed. It’s been nearly 3 years since the previous data transmission speed record was set, and you’re entirely excused if you’d only heard of a Terabit back then.

The 178 Tbps record set in August 2020 was quite remarkable for the time, but not anymore. And it certainly would have been for us here at 4GoodHosting in the same way it would have been for any good Canadian web hosting provider, given that we have a roundabout association with it through what people do with the sites we may be hosting for them. But enough about that for now; let’s get right to discussing the new data transmission world speed record and what it may mean for all of us.

Doubling Current Global Internet Traffic Speeds

This mammoth jump in speed capacity is made possible by a novel technique that leverages a single laser and a single, custom-designed optical chip that makes throughputs of 1.8Pbps (Petabits per second) possible. That works out to double today’s global internet traffic, and that highlights just how much of a game changer this has the potential to be.

The 2020 record speed we talked about earlier is only around 10% of today’s maximum throughput announcement. This equates to improving the technology tenfold in less than 3 years. A proprietary optical chip is playing a big part in this too. The way it works is that input from a single infrared laser creates a spectrum of many colors, with each color representing a frequency, much like the teeth of a comb.

Each is perfectly and equally distinguishable from the others, mimicking the way humans distinguish colors by detecting the different frequencies of light reflected toward us from materials. And because there is a set distance separating each frequency, information can be transmitted across each of them. A greater variety of colors, frequencies, and channels means that ever greater volumes of data can be sent.
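
As a rough back-of-the-envelope illustration of why more frequencies means more data, aggregate throughput is simply the per-channel rate multiplied by the number of usable comb lines. The channel count below is an assumption for the sake of the math, not a figure from the researchers:

```python
# Illustrative math only: the channel count is assumed, and a real system
# may also split light across multiple fiber cores, which this ignores.
aggregate_pbps = 1.8
aggregate_gbps = aggregate_pbps * 1_000_000        # 1 Pbps = 1,000,000 Gbps

comb_lines = 200                                   # hypothetical number of usable frequencies
per_line_gbps = aggregate_gbps / comb_lines
print(f"Each of {comb_lines} comb lines would carry ~{per_line_gbps:,.0f} Gbps")

# Comparison with the August 2020 record mentioned above
previous_tbps = 178
print(f"1.8 Pbps is roughly {aggregate_gbps / (previous_tbps * 1_000):.0f}x the 178 Tbps record")
```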

High Optical Power / Broad Bandwidth

The current optical technology we have now would need around 1,000 different lasers to produce the same amount of wavelengths capable of transmitting all of this information. The issue with that is that each additional laser adds to the amount of energy required. Further, it also means multiplying the number of failure points and making the setup more difficult to manage.

That is one of the final research hurdles that needs to be overcome before this data transmission technology can be considered more ideal. But the combination of high optical power and being designed to cover a broad bandwidth within the spectral region is immediately a huge accomplishment and one that couldn’t arrive sooner given the way big data is increasingly a reality in our world.

For now let’s just look forward to seeing all that may become possible with Petabit speeds like this that are capable of handling about 100x the flow of traffic currently accommodated as a maximum by today’s internet.

Extensive Inter-Device Trading by 2030 with Burgeoning Economy of Things

Reading Time: 3 minutes

Anyone who’s a keener with all the workings of the digital world will be familiar with the acronym IoT and the fact it stands for the Internet of Things. We’ll go ahead and assume you’re one of them if you’re reading our blog entries with any degree of regularity. It never takes long for major technological infrastructure developments to branch off into subsets of the original, and that’s exactly what’s happened with the newfound EoT – Economy of Things – and the way this new level of interconnectivity is set to revolutionize the online business world is quite something.

Much of what makes this practical and appealing at the same time is that people will always want the straightest distance between two points available to them when it comes to completing transactions for services or goods obtained. The IoT certainly has much more of a relation to services than goods, but when you think about all of the services we take advantage of on a regular basis, it makes sense that implementing web-based tech to simplify transaction processes is going to be to most people’s liking.

And this makes sense to us here at 4GoodHosting in the way it would for any good Canadian web hosting provider too. We are likely to see a lot of inroads made between the EoT and SaaS and PaaS services and products as well, and we are certainly just seeing the tip of the iceberg with how far-reaching this is going to be. So we’re going to look at how much inter-device trading is expected over the remainder of this decade with this blog entry.

Intersections for Data and Money

The expectation is that by the end of the 2020s, around 3.3 billion Internet-connected devices (IoT devices) will be trading money and data directly between themselves. The same calculations put the number for 2024 at 88 million devices participating in EoT, which puts into perspective how explosively this is going to grow over the next 6 years.

That works out to something in the vicinity of a 3,700% increase over that time, and the way it relates to people who have their business online is that many will need to evaluate their digital frameworks to see if they’re positioned to accommodate the change. Real-time global digital markets will need to be available for indexing, searching, and trading, and set up to be as accommodating as possible for EoT transactions.
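
To sanity-check that figure against the device counts above, a quick calculation helps; the ‘3,700%’ reading depends on whether you treat the multiple itself as a percentage of the 2024 base or count only the net increase, but either way the scale of the jump is clear:

```python
devices_2024 = 88_000_000
devices_2030 = 3_300_000_000

growth_multiple = devices_2030 / devices_2024            # ~37.5x
percent_of_base = growth_multiple * 100                   # ~3,750% of the 2024 base
percent_increase = (growth_multiple - 1) * 100            # ~3,650% net increase

# Implied compound annual growth rate for EoT devices over the six years
cagr = (growth_multiple ** (1 / 6) - 1) * 100             # ~83% per year

print(f"{growth_multiple:.1f}x more devices, a {percent_increase:,.0f}% increase")
print(f"Implied CAGR over six years: {cagr:.0f}%")
```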

EoT as a subset of IoT will likely dictate that more than 10% of the overall IoT market will experience a compound annual growth rate that is well above 50%, and some economists are saying this has the potential to be the ‘liquefaction of the physical world’. IoT devices will be able to share their digital assets autonomously via IoT marketplaces.

A good example of this can be seen in the growth of electric vehicles that are rightly being promoted by governments in the face of manmade climate change. Vehicles will need charging, and the EoT will be a means of fast-tracking those transactions between smart vehicles and charging stations. Further, data collected by vehicles will be valuable for others in the ecosystem. Connected vehicles could communicate and coordinate with charging points, parking space sensors, and traffic lights, and do so directly via EoT.

Huge Jump in Smart Grid Devices

These same industry experts also foresee more than 1.2 billion EoT-enabled smart grid devices in place and in operation by 2030, which makes up around 40% of the total opportunity forecast. Some 700 million supply chain devices will be located alongside the smart grid devices. AI-powered tools will be able to analyze IoT-generated data and use it to anticipate surges in demand for energy, plus sell spare capacity back to the grid when the opposite situation is the case.

Notably, Vodafone built an EoT platform last year called Digital Asset Broker in anticipation of the huge growth of EoT-connected devices over the next 6 years. Other companies with an eye on the rising EoT sector include banks and financial organizations. There are so many opportunities with EoT, and with it expected to progress quickly in its first stages, being well prepared in advance on the infrastructure side makes a lot of sense.

Higher than Anticipated Cloud Costs the Norm for Businesses

Reading Time: 2 minutes

Over the past 10 years the digital expanse for businesses has become nearly all encompassing. Without going into too much detail that’s the primary factor as to why so many businesses have had to embrace the cloud as – quite simply – their operations are necessitating it. That goes beyond data too, and companies that are utilizing the IoT are especially in need of everything cloud computing makes possible.

Now there are many instances in life where something can be described as both a blessing and a curse. That probably doesn’t legitimately apply to cloud computing, but for some of these businesses it may be at least somewhat applicable based on all the added costs they have had no choice but to assume because of moving more and more into the cloud. Damned if you do, damned if you don’t might well apply here too.

That doesn’t so much apply to us here at 4GoodHosting, in the same way it won’t for any good Canadian web hosting provider. But enough of the audience here is likely making that move to an ever greater extent, and so we’ll make the expense of cloud computing for businesses our topic for this week’s entry.

Higher than Expected Costs

Even though most businesses are happy with the rate at which their company is transforming to the cloud, higher-than-expected costs are making some businesses revisit how they are allocating the needed monies for it. All of this is based on a Cloud Impact Study in which senior IT professionals across large organizations in the US, Canada, and UK were asked about their organization’s spending on cloud infrastructure.

36% of the respondents reported that delivering cost predictability is one of the key challenges facing their organization. And the overarching belief is that while Cloud cost is high, the benefits don’t line up with how much is being spent on it. In addition, many companies are not getting the benefit of the comprehensive management strategy they expected.

The adjoining belief is that much of this stems from having rushed into cloud service adoption due to the way the pandemic nudged many businesses online. The popularity of multicloud is a factor here too, with 59% of respondents saying their organizations prefer to combine public and private cloud services.

Mitigated Challenges = Lower costs

There are plenty of reasons why diversifying across more than one provider is appealing, and a big one is in hoping to save some costs by cherry-picking only the parts they need from different service providers. But even in best-case scenarios it seems that higher-than-expected cloud costs continue to face them.

Any and all will know that embracing digital transformation is important, but more could be done to mitigate the challenges that face businesses, at least to some extent. Investing in the Cloud as an essential component for business continuity and growth in turbulent times is 100% worth it for all these companies, even if the cost of doing so is unappetizing. Increasing ITDM knowledge and understanding or employing a multicloud specialist provider will go a long way to improving the cost-benefit ratio, especially as the period of economic uncertainty continues to be a detriment to businesses worldwide.

Benefits of Website Isolation for Shared Hosting

Reading Time: 3 minutes

There have been plenty of entries here over the years where we’ve talked about the many benefits of VPS web hosting, and for those who truly need to have a private server it’s not even a consideration to go with a more affordable shared web hosting package. Websites that need elbow room to do what they do when pages load are often a big part of a business’s online presence, so while it would be nice to have a less expensive web hosting arrangement, it’s just not possible with anything less than VPS hosting.

With that understood, anyone with a website that doesn’t need to be accommodated to such an extent can and will be fine with shared web hosting most of the time. And there are smaller and less dynamic e-commerce websites that have been on shared hosting plans for years. There is more in the way of valid concerns around the security and integrity of websites hosted in shared environments though, and that’s always going to be the case. Website isolation can be a partial fix for that though, and that’s what we’re going to go over here with this week’s entry.

It’s a topic that is going to be noteworthy for any Canadian web hosting provider, and here at 4GoodHosting we are the same in the way we put an emphasis on providing valuable information for anyone who has a website for their business, venture, or personal interest and wants to be proactive in ensuring that everything stays uneventful with their site and that it’s always problem-free alongside the information superhighway.

Isolation = Better Security

Businesses, bloggers, and web developers often rely on shared hosting environments based on their affordability and the way general management is less complicated. But security is nearly always a challenge in these environments. Website isolation can be part of a solution to reducing risks, and there are 5 primary advantages for putting website isolation in place.

Better Overall Security

Being more vulnerable to security breaches is natural with shared hosting when multiple websites share the same server resources. Implementing isolation techniques helps contain the impact of a security breach, making it more likely that a compromised website does not affect others on the same server. Maintaining the overall security and integrity of all websites hosted on the shared server is always going to be important for businesses and website owners alike.

Effective Resource Management

Shared hosting environments will have all websites sharing the server’s resources, with CPU, memory, and bandwidth split between them. Maintaining the stability of the entire server is dependent on each website operating independently. Proper resource allocation can prevent resource-intensive websites from causing performance issues or crashes that may well affect other websites on the server.
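
As a rough illustration of the kind of per-account policing that supports this (a simplified sketch, not how any particular hosting control panel actually does it), the idea is to total up each account’s CPU and memory consumption against a quota and flag the accounts that exceed it:

```python
import psutil  # assumed dependency for reading per-process usage
from collections import defaultdict

# Hypothetical per-account quotas for a shared server (illustrative only)
QUOTAS = {"cpu_percent": 25.0, "memory_mb": 512.0}

def usage_by_account() -> dict:
    """Aggregate CPU and memory usage per system user (one user per site)."""
    totals = defaultdict(lambda: {"cpu_percent": 0.0, "memory_mb": 0.0})
    for proc in psutil.process_iter(["username", "cpu_percent", "memory_info"]):
        info = proc.info
        user = info["username"] or "unknown"
        totals[user]["cpu_percent"] += info["cpu_percent"] or 0.0
        if info["memory_info"]:
            totals[user]["memory_mb"] += info["memory_info"].rss / 1_048_576
    return totals

def over_quota(totals):
    """Yield accounts whose aggregate usage exceeds either quota."""
    for user, used in totals.items():
        if used["cpu_percent"] > QUOTAS["cpu_percent"] or used["memory_mb"] > QUOTAS["memory_mb"]:
            yield user, used

if __name__ == "__main__":
    for user, used in over_quota(usage_by_account()):
        print(f"{user}: cpu={used['cpu_percent']:.0f}% mem={used['memory_mb']:.0f}MB exceeds quota")
```

In practice, hosts enforce limits like these at the operating-system level with kernel features such as cgroups rather than in a monitoring script, but the goal is the same: no single account should be able to starve its neighbours.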

Safeguarding Privacy and Data

Users are only going to fully trust that their privacy is protected if the web host’s infrastructure and measures in place can be relied on to block unauthorized access to sensitive data like user information and database content. Techniques that restrict access to authorized users only play a key role in safeguarding the integrity of websites on shared servers, and this is particularly important for sites that handle sensitive customer data or payment information.

Regulation Compliance

Many industries are subject to regulations like the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA). Having isolation measures in place goes a long way toward meeting these regulations and their requirements, as it helps block unauthorized access to sensitive data.

Protecting Website Performance

Having consistent performance levels for all websites hosted on a shared server can be a challenge, but it’s a necessity if a host is going to legitimately offer it to web hosting customers. Techniques that ensure each website operates independently can help prevent resource-intensive websites from monopolizing server resources, leading to improved performance across all hosted websites.

Isolation techniques for shared hosting websites do much for maintaining security, privacy, and performance in shared hosting environments. They allow for better performance metrics with the way that each website and user account operates independently. This contributes to protecting the integrity of websites on shared servers.

Anyone considering a shared web hosting provider will do well to look into any isolation features they offer, and it’s good to ask about what measures they have in place to guarantee the safety and integrity of your online presence.

Digging into Deep Learning

Reading Time: 4 minutes

Anything in life that requires learning in order to have functional ability is going to also have a hands-on experiential component if you’re truly going to become adept at whatever it is. There is value in learning how to do it, but usually the most integral part of the learning comes from actually being at ground level and doing it. These days AI is the buzzword connected to nearly everything in the digital world, and that world is where an increasingly large part of everything we have and make use of has its roots.

We’ve talked about machine learning before, and it’s a concept that’s fairly well understood within the parameters of how advanced computing has made advances in leaps and bounds over recent years. As the saying goes, you certainly can’t stop progress, and the type of deep learning that people are talking about with relation to supercomputers and the like builds on machine learning and improves on it with a whole lot of experiential learning.

It is a profound subject, and more so for anyone or any business where much of what you do is being constantly reshaped by computing technology. That certainly applies to us here at 4GoodHosting in the same way it would for any reputable Canadian web hosting provider, and as AI continues its meteoric rise to prominence and increasing dominance in the space, it makes sense for us to have a look at Deep Learning and how it contributes so influentially to supercomputing.

Data Abundance

Deep learning is an advanced artificial intelligence technique whose usefulness comes from abundant data and increased computing power. Online language translation, automated face-tagging in social media, smart replies in your email, and the new wave of generative models are among the many reasons it’s the main technology behind many of the applications we use every day.

Everyone will have heard of all the buzz around the ChatGPT AI-powered chatbot and the way it is revolutionizing pretty much everything, and part of its strength is in the way it can relate images and text descriptions, an example of how deep-learning systems that model the relation between images and text descriptions have massive potential for ‘smarter’ and more intuitive computing.

As mentioned, deep learning is a subset of machine learning, but of the newer type where classic, rule-based AI systems and machine learning algorithms that develop their behavior through training on annotated examples are much less relevant now. That relevance has shifted to deep learning and neural networks.

Classic machine-learning algorithms solve many problems that rule-based programs struggle with, but they don’t do well with soft data such as video, images, sound files, and unstructured text. The best example of this is with predictive modelling, where the contributions of domain experts, computer programmers, and mathematicians aren’t needed to anywhere near the same extent anymore. What takes the place of all that is an artificial neural network.

Human Brain Inspiration

Deep-learning algorithms tackle the same queries and challenges using deep neural networks, a type of software architecture that takes a lot of its framework from the human brain. Neural networks are layers upon layers of variables, and they adjust themselves based on what they can recognize in the properties of the data they are trained to work with. This makes them capable of tasks like classifying images and converting speech to text.

Neural networks are especially capable of going over unstructured data and finding common patterns in it, and in recent years the greater availability and affordability of storage, data, and computing resources have solidified neural networks as one of the pinnacles of AI innovation.
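
To make the ‘layers of variables adjusting themselves’ idea a little more concrete, here is a minimal sketch in PyTorch (an assumed dependency, with placeholder data standing in for a real image dataset) of a tiny network nudging its weights toward lower error on each pass:

```python
import torch
from torch import nn

# A tiny feed-forward network: each Linear layer is a set of adjustable
# variables (weights) that the training loop tunes to fit the data.
model = nn.Sequential(
    nn.Flatten(),               # 28x28 grayscale image -> 784 values
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),         # 10 output scores, one per class
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Placeholder batch standing in for real training data (e.g. digit images)
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)   # how wrong is the network right now?
    loss.backward()                          # propagate the error back through the layers
    optimizer.step()                         # adjust every weight slightly
```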

Applications

Computer Vision

The science of having software make sense of the content of images and video is referred to as computer vision. Deep learning has made extensive progress here, and one of the best use-case examples is with cancer detection. This type of deep learning is also well-established in many of the applications people use every day. Take Apple’s Face ID, for example – it uses computer vision to recognize your face, the same way Google Photos does for various features like searching for objects and scenes as well as for image correction.

Voice and Speech Recognition

Most people have taken advantage of Siri or Alexa at some point, and deep-learning algorithms are very much at work here too as they play a role in how your voice is converted into text commands. Google’s keyboard app is Gboard, and it uses deep learning to deliver on-device, real-time speech transcription that types as you speak.

Natural Language Processing and Generation

Natural language processing (NLP) is the science of extracting the meaning from unstructured text, and developers of classic software have been looking for a workable fix to that challenge for decades. It’s not possible to define ALL the different nuances and hidden meanings of written language with computer rules, but neural networks trained on large bodies of text can execute many NLP tasks with accurate results.
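
As a small illustration of how approachable this has become, here is a sketch using the Hugging Face transformers library (an assumed dependency; the pipelines download default pretrained models) to run a couple of everyday NLP tasks on raw text:

```python
from transformers import pipeline

# Sentiment analysis: the pipeline fetches a default pretrained model
sentiment = pipeline("sentiment-analysis")
print(sentiment("The support team resolved my hosting issue within minutes."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Translation works the same way, with a different default model
translator = pipeline("translation_en_to_fr")
print(translator("Where is the nearest charging station?"))
```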

Google Translate is another popular resource, and it also experienced a boost in performance when deep learning was incorporated into it. Smart speakers use deep-learning NLP to understand the various nuances contained in spoken commands, with basic examples being requests for weather forecasts or directions.

Benefits of Domain Privacy for Business Websites

Reading Time: 3 minutes

It’s fortunate that these days a greater number of people and businesses who are online are more aware of the need to be smart about when and where they make phone numbers and email addresses available. Phishing scams are likely if you’re negligent about where your contact details are accessible, and in worst-case scenarios identity theft or even having your website hijacked can happen.

Anyone who knows anything about acting will be familiar with what a stand-in does for an actor on set, and they’re very valuable that way. Domain privacy works kind of like an actor’s stand-in: it’s a service that replaces the individual’s contact information with that of the service provider. Doing this secures you against anyone who might be trying to access your details with the aim of initiating fraud or another malevolent practice.

Business owners investing in expensive websites will want to consider getting the domain privacy add-on when building their website or having it built for them. Keep in mind as well that contact information doesn’t need to be publicly displayed on your website for it to still be available. The aim is to make sure it doesn’t end up in WHOIS records, and this is a way to protect yourself from spambots that exclusively check WHOIS data.

This option is going to make a lot of sense for many individuals or businesses, and the growing need for expanding web security practices makes domain privacy a topic that any good Canadian web hosting provider is going to take an interest in, and that’s true for us here at 4GoodHosting too. So this is what we’ll look at with our blog entry this week.

Whose Name?

The way domain privacy functions starts with how domain names are registered. That’s something we are explicitly familiar with, and most domain names in use on the Internet are registered through registrars coordinated by ICANN – the Internet Corporation for Assigned Names and Numbers. Domain owners must provide their name and contact information when purchasing a new domain name, and that information is then listed in a database called WHOIS.

Once it is catalogued in the directory it becomes accessible to anyone on the internet via the many free domain lookup tools, which anyone with an internet connection can find and use easily. Domain privacy makes it possible to hide your contact information from the WHOIS directory, substituting a randomized email address and the registrar’s own contact details for your own.
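
To see what the difference looks like in practice, a quick lookup with the python-whois package (an assumed dependency here; any free WHOIS web tool surfaces the same fields, which vary a little by TLD and registrar) is all it takes:

```python
import whois  # python-whois package (pip install python-whois)

record = whois.whois("example.com")

# With no privacy service these fields can expose the registrant directly;
# with domain privacy enabled they show the provider's proxy details
# or are redacted entirely. Field names vary by TLD and registrar.
for field in ("registrar", "name", "org", "emails", "address"):
    print(f"{field}: {record.get(field)}")
```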

 

We should add as well that there are legitimate reasons why ICANN makes it a requirement that every website owner put their contact details on display publicly. One of the bigger ones is the way it makes it easier for law enforcement agencies to track people in case there’s illegal activity on their websites. Helping lawyers hold website owners accountable in cases involving plagiarism and copyright infringement is important too.

It also makes it possible for people who are interested in buying your domain to contact you, although those aren’t always harmless communications either. The bad side of it is people trying to send you marketing spam over email or phone. Hackers have used contact information to hijack websites, and publicly listed site contacts are targeted for phishing scams very regularly.

Look to EU’s GDPR

Hackers and spammers often have dedicated bots crawling through WHOIS directories to generate lists of contact details, then pick and choose the ones where they see the most promise for whatever it is they’re aiming to do. When contact details don’t show up in WHOIS records, it’s much more likely a business will evade the attention of such systems and steer clear of these types of problems.

Here in North America we might want to look at what has happened with the European Union’s General Data Protection Regulation and how it relates to better protections for businesses that need to make contact information available. This new set of data regulations aimed at protecting internet privacy goes far with this, resulting in everyone protected under EU law seeing their contact details redacted in WHOIS listings across the internet.

Apparently ‘redacted for privacy’ is what people will see rather than the actual contact information. It may be wise for policy makers here in North America to do the same thing and offer people those same types of assurances when they’re doing business online.

Reddit Now Looking to get Paid for Role in A.I. Development

Reading Time: 4 minutes

Gleaning insight from conversations with others is what has made Reddit the hit it has been in the digital space, and if you have any area of interest you can always find at least one subreddit on it, and usually more than a few to look through. It’s one of the most engaging social networking sites if you like to ingest your information by consuming text, and the simplicity of it is really nice too. 57 million people visit Reddit every day to chat and broaden their horizons with whatever subject it is they’re interested in, and it’s a great resource for that.

Looking at it from a different angle, you might be surprised to learn that Reddit chats have also served as a free teaching aid for companies like Google, Microsoft and – most notably these days – OpenAI. Reddit conversations have been used in the development of the giant artificial intelligence systems created by these 3, and we’re seeing how those systems are already becoming such a big deal in the tech industry.

Definitely a development that everyone in our industry will take note of as well given the connection, and here at 4GoodHosting we’re like any other quality Canadian web hosting provider in that seeing an online chat forum become an open-source type of asset for these big players in the digital world is definitely something we’ll be keen to follow as well as share here with our blog.

Charging for Access

And it is definitely interesting to see that Reddit wants to be paid for accessing its application programming interface now. That’s the method through which outside entities can download and process the social network’s vast selection of person-to-person conversations, and the heads of the company at Reddit don’t think they should be making that value available for free.

So what we’ll see now is Reddit charging for access to the conversations it hosts for the purpose of developing A.I. systems like ChatGPT, OpenAI’s popular program. Reddit is also acting as it prepares for a possible initial public offering on Wall Street this year. As of now it makes most of its money through advertising and e-commerce transactions on its platform, but right now the discussion is ongoing around what they will be charging for A.P.I. access.
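
For context on what ‘A.P.I. access’ actually involves, here is a minimal sketch using PRAW, the Python Reddit API wrapper (the credentials and subreddit are placeholders; whatever pricing Reddit lands on would sit on top of requests like these):

```python
import praw  # Python Reddit API Wrapper (assumed dependency)

# Placeholder credentials: a real script needs an app registered with Reddit
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="research-script/0.1 by u/your_username",
)

# Pull a handful of top posts and their discussion threads from one subreddit
for submission in reddit.subreddit("MachineLearning").top(limit=5):
    print(submission.title)
    submission.comments.replace_more(limit=0)   # flatten "load more comments" stubs
    for comment in submission.comments.list()[:10]:
        print("  -", comment.body[:80])
```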

The New Bots

Here are the other chatbots powered by artificial intelligence that have been utilizing Reddit as a development resource:

ChatGPT – the MUCH talked about artificial intelligence language model from the research lab OpenAI. Able to respond to complex questions, write poetry, generate code, translate languages or even plan vacations. GPT-4, introduced in mid-March, is even able to respond to images.

Bing – Microsoft has a similar chatbot, capable of having open-ended text conversations on virtually any topic, though it apparently does produce occasionally inaccurate, misleading and weird responses that put it in a bit of a negative light when it first came out.

Bard – This is Google’s chatbot, originally conceived as a creative tool designed to draft emails and poems. Notable is the way it is able to generate ideas, answer questions with facts or opinions, or write blog posts entirely on its own.

Ernie – Here is the lesser-known one for sure, but Chinese search giant Baidu came out with their rival to ChatGPT in March. The name is a shortened version of Enhanced Representation through Knowledge Integration, but it hasn’t made anything near the splash that the other three have.

LLMs

L.L.M.s are sophisticated algorithms companies like Google and OpenAI have developed, and the algorithms are able to take Reddit conversations as data and add it to the vast pool of material being fed into the L.L.M.s to develop them.

Other types of companies are seeing similar value in the data they hold, and for some that means images rather than conversations. Most people will know Shutterstock, an image hosting service. It sold image data to OpenAI to help create DALL-E, the A.I. program that creates vivid graphical imagery with only a text-based prompt required.

Artificial intelligence makers need two significant things to ensure their models continue to improve. The first is a staggeringly large amount of computing power, and the second is an enormous amount of data. Many of these A.I. developer major players have plenty of computing power but still need to go outside their own networks for the data needed to improve their algorithms.

Other sources utilized alongside Reddit include Wikipedia, millions of digitized books, and academic articles, and Reddit has had a long-standing symbiotic relationship with the search engines of companies like Google and Microsoft. Their search engines have been crawling Reddit’s web pages in order to index information and make it available for search results.

With LLMs though the dynamic is different, as they obtain as much data as they can to create new A.I. systems like the chatbots. Reddit’s A.P.I. is still going to be free to developers who want to build applications that help people use Reddit, and there’s also an aim to incorporate more so-called machine learning into how the site operates. One possible benefit of that is that it might identify A.I.-generated text on Reddit and label it as such.

Leveraging Private Edge Computing and Networking Services for Better Scaled VPNs

Reading Time: 3 minutes

It’s fairly common for data storage and management needs to have outgrown what you originally set up to accommodate them and give them the elbow room they need. There are all sorts of possibilities and variations on what that outgrowing might mean for a person or organization, depending on what they do and how extensive their data needs have become. For a long time now, the default suggestion for anyone in such a situation has been to move to a Virtual Private Network (VPN).

But here we are again collectively struggling to keep up with changing needs and realities, and if we were to list out all the potential explanations as to why a VPN isn’t quite cutting it like it used to, we’d have an entire blog entry of its own. Still, VPNs are well entrenched as a critical enabling tool for today’s distributed organizations and internet users. Roughly one-third of all internet users now use a VPN to protect personal data, and that’s a number that’s going to get the attention of us here at 4GoodHosting in the same way it would for any good Canadian web hosting provider.

Then there’s the fact that there’s plenty ready to push this trend even further, especially with rampant cybercrime and privacy concerns likely to be front and center in the coming years. The pressure this puts on VPN providers is to offer reliable ways for this surging demand to be quickly, efficiently, and cost-effectively accommodated. And the need is even more acute in high-growth emerging markets which offer massive growth potential – Indonesia, China, Thailand, India, and the UAE to name the most notable ones.

The most recent and popular industry consensus is that the best way to do this is to leverage private edge computing and networking services as a means of scaling VPNs more ideally, and that’s what we’re going to look at with this week’s blog entry.

Difficult, but Doable

Let’s start with what makes this difficult. Heavy regulatory barriers, lacking infrastructure, gaps in connectivity, and expensive operating costs mean reaching customers in these markets can prove challenging. Scaling a VPN service is difficult in its own right too, and much of that is because until now there have only really been two approaches to doing it – horizontally or vertically.

When you scale up vertically it is almost always necessary to upgrade servers by replacing them. Expensive? Absolutely, and prohibitively so for a lot of the organizations that would need to eat those costs. But having optimal performance per server is a must, and so if you’re going to scale up vertically these high hardware replacement costs are pretty much unavoidable.

Scaling out horizontally presents its own set of reasons for decision makers to be dissuaded. Scaling out horizontally by adding more servers to your current infrastructure to accommodate peak user loads is expensive and time consuming. Putting together a private high-performing global network that is capable of spanning geographical distances can seem like a daunting task with how long it will take and how much it will likely cost. This is making no mention of the additional maintenance costs which add to the expensiveness.

Private Edge Solution

Having infrastructure providers that offer global, private edge computing and networking services is what’s needed, but who has the means of stepping up and doing what’s necessary to make it available for those who need it? Another option exists for VPN providers that don’t find cost efficiencies in scaling horizontally or vertically.

That’s to work with a 3rd-party infrastructure enabler that has private, high-quality compute and networking services available at the edge of the network. The key part of this being at the edge is that it would be relatively close to end users in strategic markets. That takes the distance problem out of the equation, and by outsourcing network and compute operations these providers can instantly scale into global markets and serve new VPN customers.

Immediate benefits:

  • Improved performance, with more consistent stability in overseas markets
  • Reduction in long distance data transmissions resulting in faster data transfers and much less in the way of performance issues (latency / jitter)
  • Better security stemming from 3rd-party infrastructure providers being able to grant access to premium bare metal and virtual machines (VM) for enhanced VPN security and scaling more safely
  • Less maintenance due to avoiding more constricted VPN setups where many servers are spread out across multiple locations
  • Lower operating costs as by outsourcing operations you are able to leverage flexible pricing models and pay less for the bandwidth you need

Last but not least, aggregate bandwidth pricing makes it possible for you to evaluate the balance between underutilized and overutilized servers. You are then able to reduce bandwidth waste and make the most of your bandwidth spend.
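
As a simple sketch of what evaluating that balance can look like (the server names and figures below are made up), aggregate pricing lets you compare total committed bandwidth against what the fleet actually pushes, instead of paying for headroom server by server:

```python
# Hypothetical per-server committed vs. actually used bandwidth, in Mbps
servers = {
    "sgp-edge-01": {"committed": 1000, "used": 930},
    "dxb-edge-01": {"committed": 1000, "used": 180},
    "bkk-edge-01": {"committed": 1000, "used": 410},
}

committed = sum(s["committed"] for s in servers.values())
used = sum(s["used"] for s in servers.values())

print(f"Fleet utilization: {used}/{committed} Mbps ({used / committed:.0%})")
for name, s in servers.items():
    ratio = s["used"] / s["committed"]
    if ratio < 0.3:
        print(f"{name} is underutilized ({ratio:.0%}): shift traffic or shrink its commit")
    elif ratio > 0.9:
        print(f"{name} is running hot ({ratio:.0%}): add capacity before peak load")
```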

Minimizing Loss and Latency with Cloud-Based Apps

Reading Time: 4 minutes

Be it right or wrong, being accommodating and understanding of something or someone only occurs if basic expectations are still being met. Most of you who’d be reading this blog in the first place will know what a bounce rate is, and even though we might not know it we all have an inner clock that dictates how long we’ll be willing to wait for a page to load.

Page loads and page speeds are different things, though, but all of this just highlights what’s already well known in the digital world. There’s only so much waiting a person can be expected to do, and that has led to efforts to minimize loss and latency with cloud-based apps.

The success they’ve had with doing that is what we’ll talk about with our blog entry here this week. Cloud-based technology has been integral to how many of the newest apps have the impressive functionality they do, and even if you’re not the savviest person about it, you are probably benefiting from it in ways you’re not even aware of, based on that mini PC or Mac masquerading as a ‘phone’ in your pocket.

Having so many developers catering to public cloud IaaS platforms like AWS and Azure, and PaaS and SaaS solutions too, is made possible by the simplicity of consuming the services – at least to some extent, when you are able to connect securely over the public internet and start spinning up resources.

This is something that shows up on the horizons for good Canadian web hosting providers like us here at 4GoodHosting, as it’s definitely within our sphere.

So let’s have a look at what’s known with the best ways to minimize loss and latency with cloud-based apps.

VPN And Go

The default starting point for any challenge that needs to be addressed or choice that needs to be made is to use the internet to connect to the enterprise’s virtual private clouds (VPCs) or their equivalent from company data centers, branches, or other clouds. Preferably with a VPN, but doing so doesn’t guarantee an absence of problems for modern applications that depend on lots of network communications among different services and microservices.

Quite often the people using those applications can run into problems with performance, and more often than not it’s related to latency and packet loss. That’s a logical enough connection to make, but there’s more to it – specifically their magnitude and variability. Loss and latency problems will be a bigger deal for internet links than across internal networks. Loss results in more retransmits for TCP applications or artifacts due to missing packets for UDP applications, and too much latency will mean slower responses to requests.

If that’s the scenario and there are service or microservice calls across the network, then this is where loss and latency are most going to hamper performance and take away from user satisfaction in a big way. Values that might be tolerable when there’s only a handful of back-and-forths can become wholly intolerable when modern application architectures multiply the number of them many times over.
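
A rough worked example (with purely illustrative numbers) shows why that chattiness matters so much: the per-call cost that is fine for a handful of sequential calls becomes the dominant factor when a single user request fans out into dozens of them.

```python
def response_time_ms(calls: int, latency_ms: float, loss_rate: float,
                     retransmit_penalty_ms: float = 200.0) -> float:
    """Crude model: sequential service calls, each paying round-trip latency,
    with a fraction of them paying a TCP retransmit penalty on top."""
    return calls * (latency_ms + loss_rate * retransmit_penalty_ms)

# Internal network vs. a plain internet path (illustrative figures only)
for label, latency, loss in [("internal LAN", 2, 0.0001), ("public internet", 40, 0.01)]:
    for calls in (5, 50):
        total = response_time_ms(calls, latency, loss)
        print(f"{label:15s} {calls:3d} calls -> {total:7.1f} ms")
```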

Varying Experiences

More variability in latency (jitter) and packet loss on internet connections increases the chance that any given user gets a widely varying application experience – one that may be great, absolutely terrible, or anywhere in between. That unpredictability is as big an issue as the slow responses or glitchy video or audio some users get some of the time.

3 specific cloud-based resources come to the forefront as solutions to these problems: direct connection, exchanges, and cloud networking.

A dedicated connection to the cloud is the first one we’ll look at. This is where the customer’s private network is directly connected to the cloud provider’s network. This will usually involve placing a customer switch or router in a meet-me facility. The cloud service provider’s network-edge infrastructure then connects to it with a cable so packets can travel directly from the client network to the cloud network, with no need to traverse the Internet.

The only potential hangup is with WAN latency. But as long as the latency to the meet-me facility is acceptable, performance should be comparable to an inside-to-inside connection. If there’s a potential downside, it’s probably how expensive direct connects are compared to simple internet connectivity. They also tend to come only in large-denomination bandwidths. Finding something smaller than 1Gbps is unlikely.

Multiple CSPs with Exchanges

Big pipes are always an advantage, and that’s true in any context you can use the term. Cloud service providers (CSPs) with big pipes are able to take large physical connections and separate them into smaller virtual connections at a broad range of bandwidths under 100Mbps. The enterprise user makes a single direct physical connection to the exchange and can then make virtual direct connections over it to reach multiple CSPs through the exchange.

The next consideration here is for internet-based exchanges that maintain direct connects to CSPs but still leave customers free to connect to the exchange over the internet. The provider typically offers more in the way of onboarding locations plus a wide network of points of presence at its edge. This means customer traffic spends less time moving around the internet before making the important exit into the private network, so it experiences less latency and loss.