There’s no doubt that artificial intelligence (AI) is transforming the way we work. AI is everywhere because everything we do generates data. It’s been estimated that a single internet user produces around 146,880 megabytes of data every day and, according to Statista, the roughly 120 zettabytes generated worldwide in 2023 is expected to grow by more than 50% to 181 zettabytes in 2025.

It’s a simple fact that AI requires data – big data – to mature. At the same time, accessing and analysing these colossal datasets requires massive amounts of computational power and memory. Dr. Eng Lim Goh, HPE’s SVP and CTO for AI, explains it best: “From data, a machine learns to be artificially intelligent. AI is the newest frontier of data – it is what drives the edge to be intelligent and the cloud to be hybrid. Together with analytics, AI is becoming the new tool to extract value from data for actionable insights.”

One of the ways businesses are already using AI is by combining it with traditional modelling and simulation. From healthcare to agriculture, retail and manufacturing, AI-enhanced simulations are helping every industry to predict potential outcomes, devise innovative solutions and make smarter decisions. The next step is training and tuning models faster so they can continuously adapt and learn autonomously, revolutionising the way industries operate and innovate.
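
The pattern behind AI-enhanced simulation can be sketched in a few lines: run a comparatively expensive traditional simulation a limited number of times, then train a machine learning surrogate on the results so that new scenarios can be evaluated almost instantly. The snippet below is a generic illustration only – a toy physics formula stands in for a real simulator, and nothing here reflects a specific HPE product or workflow.

```python
# Illustrative sketch (not HPE's tooling): using an ML surrogate to speed up a
# traditional simulation. A toy physics formula stands in for the costly simulator.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_simulation(params):
    # Stand-in for a slow numerical simulation: projectile range from
    # launch speed v and angle theta, range = v^2 * sin(2*theta) / g
    v, theta = params[:, 0], params[:, 1]
    return (v ** 2) * np.sin(2 * theta) / 9.81

# Run the "simulation" on a limited budget of parameter settings...
X_train = rng.uniform([1.0, 0.1], [50.0, 1.4], size=(500, 2))
y_train = expensive_simulation(X_train)

# ...then train a surrogate that predicts outcomes for new settings instantly.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

X_new = np.array([[30.0, 0.7]])
print(surrogate.predict(X_new), expensive_simulation(X_new))
```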

AI language models and generative AI, in particular, need to process data at scale. In addition, the data must be available, clean, and compliant with existing regulations. A large language model (LLM) generates text one token – roughly one word – at a time, yet manages to produce coherent, human-like output. To do this, it relies on a statistical representation of language learned from its training data, so that each new token fits the context that came before it. Behind the model sits a data pipeline that gathers and prepares training data from sources such as the internet, a process that can take months and that must keep accumulating fresh information so the AI stays relevant. Even on a supercomputer with high-bandwidth storage, processing, indexing and updating datasets of that size is a formidable challenge.
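
To make “one token at a time” concrete, here is a minimal sketch of autoregressive generation using the publicly available GPT-2 model via Hugging Face Transformers. GPT-2, the prompt and the greedy decoding loop are stand-ins chosen for illustration; production LLMs are far larger and use more sophisticated sampling.

```python
# Minimal sketch of next-token generation; GPT-2 is a stand-in, not a production LLM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Artificial intelligence is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# The model produces text one token at a time: each step predicts a probability
# distribution over the vocabulary, and the chosen token is appended to the input.
for _ in range(20):
    with torch.no_grad():
        logits = model(input_ids).logits          # shape: (1, seq_len, vocab_size)
    next_id = torch.argmax(logits[:, -1, :], dim=-1, keepdim=True)  # greedy choice
    input_ids = torch.cat([input_ids, next_id], dim=-1)

print(tokenizer.decode(input_ids[0]))
```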

A great example is OpenAI’s conversational bot, ChatGPT. According to a University of California study, training GPT-3, the LLM OpenAI released back in 2020, consumed 1,287 MWh of electricity as well as 700,000 litres of clean fresh water in the data centre. 1,287 MWh is enough energy to power more than 100 homes for over a year, a comparison that has sparked many conversations about the environmental cost of running such a big model. Ultimately, AI’s energy use isn’t sustainable if data is processed in a way that creates a mountain of technical debt (GPT-3’s training ran on 1,024 GPUs and cost an estimated $4.6M in compute alone) and increases your business’s carbon footprint.
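
As a rough sanity check on that comparison – and assuming an average household consumption of around 10,500 kWh per year, a commonly cited US figure that is not taken from the study itself – the back-of-the-envelope arithmetic looks like this:

```python
# Back-of-the-envelope check on the "more than 100 homes for over a year" comparison.
# Assumption (not from the article): an average home uses ~10,500 kWh per year.
training_energy_kwh = 1_287 * 1_000          # 1,287 MWh converted to kWh
household_kwh_per_year = 10_500              # assumed annual household consumption
homes_for_a_year = training_energy_kwh / household_kwh_per_year
print(f"Roughly {homes_for_a_year:.0f} homes powered for a year")  # ≈ 123 homes
```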

This is where HPE comes in: helping businesses deploy and maintain AI at scale in a way that’s more efficient. In a blog post, HPE’s president and CEO wrote that “AI will create superpowers for those who tap into it, fuelling tremendous advancements for business and society, but with these great superpowers comes great responsibility.”

HPE takes a holistic approach that prioritises sustainability across the stack: from the infrastructure and software, to where models are trained and deployed, to how they are powered and cooled, including the use of renewable energy. Speed matters too: the faster the supercomputer, the more efficiently an AI model can run, and HPE Cray supercomputers hold six of the top 10 places on the GREEN500 list, which ranks the world’s 500 most energy-efficient supercomputers.

Are you ready to reap the benefits of AI for your business? HPE has the tools and technology to build AI into your existing business strategy, turn questions into discovery, insights into action, and imagination into reality.

Curious about how Axiz and HPE can help your organisation unlock the power of data?

Contact us to learn more: [email protected]
