

09.09.2024

From Supercomputing to AI: How EuroCC Supports Businesses



Supercomputing is rarely associated with Austria, yet the country plays a crucial role in high-performance computing for scientists and businesses. Markus Stöhr, head of EuroCC Austria, the national competence centre for supercomputing, big data, and artificial intelligence, explains in our interview how supercomputers and AI are transforming the day-to-day work of science and business – a development that happens without much fuss, with typical Austrian understatement.

Interview by Bettina Benesch

Markus, you have been working in high-performance computing (HPC) for 20 years. What is this technology used for?


Traditionally, HPC has been used to run simulations for various problems that require processing large amounts of data simultaneously. For example, if I'm building a skyscraper, I can determine how the steel structure should be designed to remain stable. HPC is also needed in aircraft construction to find out how a component should be shaped to optimise airflow.

You can also simulate electronic components, such as transistors or processors, to determine how much heat is generated in the housing when current flows. Weather and climate forecasting is also typically done on high-performance computers. Until now, HPC has been a domain of science; with EuroCC, we are taking a step further to make supercomputing more accessible to startups and small and medium-sized enterprises (SMEs).


How do you do that in practice?
 

We offer a variety of support services. One of them is HPC access. For companies that have already adopted HPC but need more computing power, we facilitate access to systems in Austria and Europe, familiarise them with the computers, and help them along the way. Then there are companies for which HPC is entirely new. For them, we provide training or individual consultations to show what is possible, and we offer further support depending on their needs.

Until now, HPC has been a domain of science; with EuroCC, we are taking a step further to make supercomputing more accessible to startups and SMEs.

A big topic right now is artificial intelligence (AI): many companies and startups are dealing with it. They have an idea and a basic understanding of their problem and how to solve it. But eventually, they get to a stage where they need to scale, for example, when handling multiple projects simultaneously or serving more customers.

 

Artificial intelligence is a broad field with hundreds of applications. Do you have a specific focus?


Our main focus right now is on health and bioinformatics, as Vienna is a hotspot for medical research and development. Additionally, we are investing resources in supporting projects from the legal domain. This involves extracting relevant information from an overwhelming amount of legal texts or court rulings. Here, we are in the realm of large language models (LLMs), a field that is booming at the moment.

Traditional HPC remains a big topic, too, and we receive many inquiries from companies, for example, for weather simulations or finite element simulations used to calculate material strength.


How do companies usually get on board with HPC and AI at EuroCC?
 

The first step for companies is to contact us by e-mail. Then we determine where the company stands, what they want to compute, and how much computing power is needed. We then offer to conduct a test project on the Austrian flagship supercomputer, the Vienna Scientific Cluster (VSC). This means we set up access and support the company in using the computer.

Usually, an HPC expert works with the company and is the point of contact for questions. This sets us apart from cloud providers, who typically only provide access and nothing more. Additionally, VSC experts regularly offer introductory courses explaining how to use the system.

 

Every company receives individual support from our HPC experts. This sets us apart from cloud providers, who typically only provide access and nothing more.

You mentioned that it makes sense to switch to HPC once you reach a certain amount of data. What computing power are we talking about with the VSC systems, and when does one need a supercomputer?
 

A laptop, for example, has one processor and usually 30 to 60 gigabytes of RAM, some even less, maybe just eight. Each node of our latest high-performance computer, VSC-5, has 128 cores and 512 gigabytes of RAM. So even when using just one of the approximately 800 compute nodes of VSC-5, we already have several times the performance and resources of a normal workstation PC.

Of course, there are also servers that any company can install at their own site. The cloud is also an alternative. For most applications, that may be enough. But when it gets tight, you need a supercomputer.
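
To make this concrete, here is a minimal back-of-the-envelope sketch – not an EuroCC or VSC tool, just an illustration using the figures quoted above (512 GB per VSC-5 node, roughly 800 nodes) and an invented simulation workload – of how one might judge whether a job still fits on a laptop or already calls for a supercomputer.

```python
# Back-of-the-envelope check: does a simulation fit on a laptop
# or does it need VSC-5 nodes? Machine figures are taken from the
# interview; the workload below is purely illustrative.

BYTES_PER_VALUE = 8       # double-precision floating point
LAPTOP_RAM_GB = 60        # upper end of the laptop range quoted above
VSC5_NODE_RAM_GB = 512    # memory of one VSC-5 node, as quoted above
VSC5_NODES = 800          # approximate number of VSC-5 compute nodes

def memory_needed_gb(grid_points: int, values_per_point: int) -> float:
    """Rough memory footprint of a simulation state in gigabytes."""
    return grid_points * values_per_point * BYTES_PER_VALUE / 1e9

# Hypothetical CFD-style grid: 2000 x 2000 x 500 cells, 10 values per cell
need = memory_needed_gb(2000 * 2000 * 500, 10)
print(f"Simulation state: ~{need:.0f} GB")

if need <= LAPTOP_RAM_GB:
    print("Fits on a laptop.")
elif need <= VSC5_NODE_RAM_GB:
    print("Fits in the memory of a single VSC-5 node.")
else:
    nodes = -(-need // VSC5_NODE_RAM_GB)   # ceiling division
    print(f"Needs roughly {int(nodes)} VSC-5 nodes (out of ~{VSC5_NODES}).")
```

Memory is only one dimension, of course – runtime and the number of jobs that have to run in parallel matter just as much – but it shows how quickly a realistic simulation outgrows a single machine.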


What does supercomputing cost?
 

At VSC, there is a difference between academic and commercial use, because the VSC is funded with public money. Researchers at the participating universities can use the systems for free. Other universities and companies use VSC on a pay-per-use basis. On the VSC-5, companies pay around four to ten euros per node per hour of computing time; for companies that work with EuroCC, computing power for test projects or proofs of concept is provided free of charge.
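
As a rough illustration of what pay-per-use means in practice: the rate range below is the four to ten euros per node-hour quoted above, while the job size and runtime are purely hypothetical.

```python
# Rough cost estimate for commercial pay-per-use on VSC-5,
# using the 4-10 euros per node-hour range quoted in the interview.
# The job size and runtime below are invented for illustration.

RATE_LOW_EUR = 4.0    # euros per node per hour (lower bound)
RATE_HIGH_EUR = 10.0  # euros per node per hour (upper bound)

def job_cost_eur(nodes: int, hours: float, rate: float) -> float:
    """Cost of occupying `nodes` nodes for `hours` wall-clock hours."""
    return nodes * hours * rate

# Hypothetical test project: 4 nodes for 12 hours
nodes, hours = 4, 12
low = job_cost_eur(nodes, hours, RATE_LOW_EUR)
high = job_cost_eur(nodes, hours, RATE_HIGH_EUR)
print(f"{nodes} nodes x {hours} h: {low:.0f}-{high:.0f} euros")
# For EuroCC test projects, this computing time is free of charge.
```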


How do you think HPC will change in the coming years?


On the hardware side, every new system will contain many GPUs, i.e., graphics cards, and thus become more powerful. GPUs will be particularly needed for AI applications because they are optimal for training neural networks and similar data-intensive tasks.

AI will be increasingly used by scientists who hardly needed computers before.


More computing power, more data, even faster?


Yes. The traditional HPC domains, such as materials science, engineering, physics, chemistry and bioinformatics, will continue to use HPC because simulations are usually cheaper than experiments in labs. In addition, AI applications are emerging. So there is a significant change in terms of user groups and application fields. We are seeing more researchers from non-traditional HPC areas, such as linguists who want to distil specific information from a large number of publications.

HPC is heading towards machine learning. Artificial intelligence will certainly keep us busy in the coming years, and many people from scientific disciplines that have hardly worked with computers before will move in this direction.
 

How is data protection handled at EuroCC?


Currently, we have two options: first, if desired, we can guarantee that no backup of the data is made – nothing is copied anywhere. Second, each user has individual permissions, so others have no access. Greater isolation is not possible at the moment. However, from 2025 onwards we will have a new high-performance computing cluster, MUSICA (Multi-Site Computer Austria), where security will be even higher. Theoretically, it will also be possible to encrypt the data, although this will likely come at a cost in performance.
 

EuroCC was established in 2020. What has improved since you started?


We have been able to significantly expand the training programme on HPC and artificial intelligence. There are many new courses available to both academic and commercial users from across Europe, some of which are free of charge, and all of which are in high demand. In academia, Austria has been doing well in HPC for a long time. With the EuroCC competence centre, the main goal now is to help companies implement their ideas with state-of-the-art technology. Access to high-performance computing through EuroCC is now much easier for SMEs and startups.


Short bio

Markus Stöhr has been involved in high-performance computing since 2004 and has been working for BOKU University at the supercomputing centre Vienna Scientific Cluster (VSC) since 2011. He completed his doctoral thesis on VSC-1, an early generation of the system; today, VSC-5 offers 65 times more computing power. Markus has been in charge of EuroCC Austria, the national HPC competence centre, since 2020, making access to high-performance computing, AI, and big data easier, especially for small and medium-sized enterprises and startups.


About the key concepts