Use cases


Machine learning

Machine learning is an area of artificial intelligence that automates the construction of analytical models and can be used, for example, to support decision-making or to improve the efficiency of production processes. Machine learning algorithms identify patterns in data and help data scientists solve problems: the resulting models can be used to predict values, detect unusual events, uncover structures, and create categories.

Once an algorithm has identified a model in the data, that model can be used to generate predictions and forecasts. Machine learning can, for example, utilise customer-related data to identify behavioural patterns, which can then be used to optimise product recommendations and provide the best possible customer experience.
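As a minimal illustration of one of the tasks mentioned above, detecting unusual events, the sketch below flags data points that deviate strongly from the rest of a series. This is a deliberately simple statistical baseline, not a description of any particular method used on our systems; the threshold value is an assumption chosen for the example.

```python
import statistics

def find_anomalies(values, threshold=2.0):
    """Flag values lying more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing unusual to report
    return [v for v in values if abs(v - mean) / stdev > threshold]

# A sensor reading of 45.0 among values near 10 is flagged as unusual:
print(find_anomalies([10.1, 9.9, 10.0, 10.2, 9.8, 45.0]))
```

Real machine-learning pipelines replace this hand-written rule with models learned from the data itself, but the goal, separating the unusual from the expected, is the same.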

Computational fluid dynamics

Computational fluid dynamics (CFD) deals with simulations of gas and liquid flows. CFD utilises high-performance computing with the aim of producing a precise numerical solution for equations describing the movement of a liquid or gas. CFD is used in basic research on fluid dynamics, designing complex flow configurations, and predicting interactions between chemical substances. Mathematical modelling of liquid flows can be applied to a wide range of research and design problems across different industries and research sectors, including aerodynamics, space/aerospace research, weather simulations, and engine and combustion analysis.

Information management and analytics services

Our supercomputers have plenty of storage space, and their high processing capacity makes it possible to turn data into meaningful information, forecasts and recommendations. CSC has expertise in distributed data processing, data integration and data analytics. We can provide impartial advice on the best commercial and open source solutions available.


Genomics

Genomics is transitioning from research-driven activities to activities led by health care organisations. It has been predicted that health care providers will produce the majority of genomic data in the near future. To maximise the value of genomic data, the data must be shareable between organisations and across national borders. Research is strongly dependent on computational efficiency, and our understanding of the genome has been identified as a major factor in the development of health care.

Natural language processing (NLP)

Natural language processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence concerned with the interactions between computers and human language. In particular, the aim is to automate the processing and analysis of large amounts of natural language data. The goal is an AI capable of "understanding" the contents of documents, including the contextual nuances of the language within them. The technology can then accurately extract the information and insights contained in the documents, as well as categorise and organise the documents themselves.

Frequent challenges in natural language processing involve speech recognition, natural language understanding, and natural language generation.
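At its simplest, extracting information from documents starts with tokenising text and measuring which terms matter. The sketch below is a bare-bones keyword extractor; the tiny stopword list and the tokenisation rule are assumptions for the example, and real NLP systems use trained language models instead.

```python
import re
from collections import Counter

# Minimal illustrative stopword list; real systems use far larger ones.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "it", "on"}

def keywords(text, n=3):
    """Return the n most frequent non-stopword tokens in the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

print(keywords("the cat sat on the mat and the cat slept", n=1))
```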

Quantum computing

Quantum computers are revolutionising research and development that utilises computational modelling. Thanks to a completely new way of computing, quantum computers will soon be able to solve some problems much more efficiently than classical supercomputers. To take advantage of this computing power, however, problems must first be reformulated in a form that is amenable to quantum computing. Particularly suitable applications include, for example, financial modelling, risk analysis, machine learning, molecular and materials sciences, and logistics and optimisation problems in general. Together with VTT and Aalto University, CSC maintains and develops the Finnish Quantum-Computing Infrastructure FiQCI, which harnesses the combined power of supercomputers and quantum computers and provides our customers with a development and testing platform for quantum computing.
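To give a flavour of the "completely new way of computing": a quantum bit can be placed in a superposition of 0 and 1, something a classical bit cannot do. The sketch below simulates the single-qubit Hadamard gate on an ordinary computer; it is a pedagogical toy, not how FiQCI hardware is programmed.

```python
import math

def apply_hadamard(state):
    """Apply the Hadamard gate H to a single-qubit state [a, b],
    where a and b are the amplitudes of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Measurement probabilities are the squared amplitude magnitudes."""
    return [abs(amp) ** 2 for amp in state]

# Starting from |0>, the Hadamard gate yields an equal superposition:
print(probabilities(apply_hadamard([1, 0])))  # 50/50 chance of 0 or 1
```

Simulating n qubits this way requires 2**n amplitudes, which is precisely why classical simulation runs out of steam and real quantum hardware becomes interesting.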

Drug research

Drug development is a long process in which extensive computational work can be used to focus expensive experimental phases on the molecules that are most likely to be effective. Atomic models can be used to study how medicinal substances bind to and affect their target molecules, or to test different hypotheses. Pharmaceutical research has long utilised modelling in its different phases, but the capacity offered by supercomputers makes it possible to analyse enormous quantities of molecules at an unprecedented rate. In addition, the most sophisticated methods already produce accurate results for a smaller selected group of molecules faster than laboratory tests – and without the chemical waste.
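One simple example of computationally narrowing down a candidate set is filtering molecules by drug-likeness before expensive simulation or synthesis. The sketch below applies Lipinski's well-known "rule of five"; the candidate data and dictionary keys are hypothetical, and real virtual-screening pipelines use far richer models.

```python
def passes_rule_of_five(mol):
    """Lipinski's rule of five: a rule-of-thumb filter for oral drug-likeness.
    `mol` is a dict with molecular weight, logP, and H-bond donor/acceptor counts."""
    return (mol["mol_weight"] <= 500
            and mol["logp"] <= 5
            and mol["h_donors"] <= 5
            and mol["h_acceptors"] <= 10)

def screen(candidates):
    """Keep only the candidates that pass the drug-likeness filter."""
    return [m for m in candidates if passes_rule_of_five(m)]

# Hypothetical candidates: only the first satisfies all four criteria.
candidates = [
    {"name": "cand-A", "mol_weight": 320.0, "logp": 2.1, "h_donors": 2, "h_acceptors": 5},
    {"name": "cand-B", "mol_weight": 612.0, "logp": 6.3, "h_donors": 7, "h_acceptors": 12},
]
print([m["name"] for m in screen(candidates)])
```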

The chemical industry

The details of several processes of industrial importance can be modelled using atomic-level methods. These include molecular dynamics and density functional theory. Simulations can be used to determine, for example, the operating mechanisms of catalysts, the properties of mixtures and solutions, and how these behave on surfaces. This understanding can be used to create new, more efficient catalysts or to optimise process conditions. The power of supercomputers makes it possible to build and use models that have significance in the real world.
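At the heart of the molecular dynamics method mentioned above is a time-stepping integrator that advances atomic positions under the forces acting on them. The sketch below shows the widely used velocity Verlet scheme for a single particle in a harmonic potential; it is a one-dimensional illustration, whereas production simulations integrate millions of interacting atoms.

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Advance position x and velocity v with the velocity Verlet integrator,
    the core update step of molecular dynamics."""
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt   # update position
        f_new = force(x)                            # force at the new position
        v += 0.5 * (f + f_new) / mass * dt          # update velocity
        f = f_new
    return x, v

# A particle in a harmonic potential (force = -x) oscillates while
# conserving total energy to high accuracy:
x, v = velocity_verlet(x=1.0, v=0.0, force=lambda x: -x, mass=1.0, dt=0.01, steps=1000)
print(0.5 * v * v + 0.5 * x * x)  # total energy stays close to 0.5
```

Energy conservation over long trajectories is what makes this family of integrators the standard choice in molecular dynamics codes.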

Our team at your service

Pekka Uusitalo

tel. +358 50 042 7720

Dan Still

Partnerships manager
tel. +358 50 381 9037

Juhani Huttunen

Customer solution manager
tel. +358 40 581 1138