AI Is Becoming Speedier

Reading Time: 5 minutes

Artificial Intelligence has been a long work in progress, but in recent years it has really started to make its mark. Its capabilities were never in question; the speed at which those capabilities could be delivered sometimes was. Performance speed isn't essential for every AI-related task, but for some of them it is very much of the essence. AI is earmarked for an extensive role in healthcare, for example, and that is one area where both accuracy and quick results are definitely going to be required.

Recent AI performance testing results came out a little more than a month ago, and they show that AI is getting faster, which is great news for the increasingly digital modern world. The future of AI is of great interest to us here at 4GoodHosting, as it would be for any Canadian web hosting provider that enjoys keeping an eye on the future and what it will entail for IT and everything that grows along with it.

This measuring of general AI performance follows the first official set of benchmarks released much earlier, and it lays out 350 measurements of energy efficiency. Most of the systems measured improved by between 5 and 30% over their previous testing, and some more than doubled their previous performance stats.

So what is the significance of this, and what will it mean for development? That's what we'll look at here this week.

6-Way Performance Testing

This testing involved systems made up of combinations of CPUs and GPUs or other accelerator chips, each tested on six different neural networks performing an array of common functions: image classification, speech recognition, object detection, 3D medical imaging, natural language processing, and making recommendations. Computers meant to work onsite rather than in the data center had their measurements taken in an offline state and while receiving a single stream of data, to reflect the least ideal conditions they might face.
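
To make the distinction between those two measurement scenarios a little more tangible, here is a minimal sketch of how an offline run (all samples available up front, throughput is what matters) differs from a single-stream run (one sample at a time, latency is what matters). The run_inference function is a hypothetical stand-in for whatever model is actually being benchmarked, not part of any real test harness.

```python
import time

def run_inference(sample):
    # Hypothetical stand-in for a real model's forward pass.
    return sum(sample)

samples = [[0.1] * 1024 for _ in range(1000)]

# Offline scenario: every sample is available up front; report throughput.
start = time.perf_counter()
for s in samples:
    run_inference(s)
elapsed = time.perf_counter() - start
print(f"offline throughput: {len(samples) / elapsed:.1f} inferences/sec")

# Single-stream scenario: one sample arrives at a time; report per-sample latency.
latencies = []
for s in samples:
    t0 = time.perf_counter()
    run_inference(s)
    latencies.append(time.perf_counter() - t0)
print(f"single-stream worst-case latency: {max(latencies) * 1000:.2f} ms")
```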

As for the AI accelerator chips used in the tested machines, the most notable finding was software improvements that delivered up to a 50% gain in performance. The typical configuration was 1 or 2 CPUs plus as many as 8 accelerators. Of all the chips tested, Nvidia's A100 accelerators tested best and showed the most potential.

Multi-Instance GPUs Show Huge Promise

Nvidia has also created a splash with a new software technique called multi-instance GPU (MIG), which allows a single GPU to take on the roles of seven separate chips from the point of view of the software running on it. Tests that had all six benchmarks running simultaneously, plus an extra instance of object detection, came back with solid results that were 95% of the single-instance values.
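
For readers who want to poke at this on their own hardware, here is a minimal sketch of checking whether MIG mode is enabled and how many GPU instances a device can expose. It assumes the nvidia-ml-py (pynvml) package is installed and an MIG-capable GPU such as the A100 is present; on other GPUs the MIG query will simply raise a not-supported error.

```python
# Minimal sketch: query MIG status via NVML. Assumes the nvidia-ml-py
# (pynvml) package and an MIG-capable GPU (e.g. an A100) in slot 0.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    current_mode, pending_mode = pynvml.nvmlDeviceGetMigMode(handle)
    max_instances = pynvml.nvmlDeviceGetMaxMigDeviceCount(handle)
    state = "enabled" if current_mode else "disabled"
    print(f"{name}: MIG {state}, up to {max_instances} GPU instances")
finally:
    pynvml.nvmlShutdown()
```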

It should be noted here, though, that supercomputer testing doesn't usually lend itself to conventional result categorization, and the only part of it that really does is efficiency testing, which is based on inferences per second per watt for the offline component (a quick worked example of that metric follows the list below). Much was revealed by the tests based on this metric, but what's probably more valuable here is to highlight the new industry benchmark for this kind of performance, the TPCx-AI benchmark, which is based on:

  • Ability to generate and process large volumes of data
  • Training on pre-processed data to produce realistic machine learning models
  • Accuracy in generating insights for real-world scenarios based on those models
  • Scalability for large, distributed configurations
  • Level of flexibility for configuration changes to meet changing AI landscape demands
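
To make the efficiency metric mentioned above concrete, here is a minimal worked sketch of how inferences per second per watt could be calculated for an offline run. The numbers plugged in are placeholders for illustration only, not figures from the actual testing.

```python
def inferences_per_second_per_watt(total_inferences, elapsed_seconds, avg_power_watts):
    """Efficiency for an offline run: throughput divided by average power draw."""
    throughput = total_inferences / elapsed_seconds  # inferences per second
    return throughput / avg_power_watts

# Placeholder numbers for illustration only.
print(inferences_per_second_per_watt(1_000_000, 120, 400))  # ~20.8 inferences/sec/watt
```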

Accurate Data, On the Fly

The new TPCx-AI puts the priority on realistic, genuine, and accurate data that can be reliably generated on the fly. It seems very likely that this new benchmark will quickly be adopted as the gauge by which AI processing speeds and the data produced are evaluated. Being able to generate that data on request, and quickly, is going to be huge pretty much right across the board.
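
As a loose illustration of what generating data on the fly can look like, here is a minimal sketch that produces synthetic records on request. The field names and value ranges are invented for the example and are not taken from the actual TPCx-AI data generator.

```python
import random

def generate_records(n, seed=None):
    """Yield n synthetic customer-style records on demand."""
    rng = random.Random(seed)
    for i in range(n):
        yield {
            "id": i,
            "age": rng.randint(18, 90),
            "monthly_spend": round(rng.uniform(10.0, 500.0), 2),
        }

# Generate a small batch on request, e.g. to feed a model or a benchmark run.
batch = list(generate_records(5, seed=42))
print(batch)
```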

Deep learning neural networks and GPU hardware are going to play a big role in all of this, and natural language processing is going to be a must if AI is going to adapt in the way it needs to in order to serve people in other parts of the world. There's reason for optimism here too, as an exponentially increasing number of highly accurate captions are now being written purely by AI. They're generated in milliseconds and delivered directly to customers without domain experts needing to be involved.

All of this has dramatically increased the speed, quality, and scalability of alert captions, exactly the type of data language that matters most when it comes to applications of artificial intelligence aimed at improving people's quality of life in the future.
