
Image: Horacio González-Vélez
For NCI’s Prof Horacio González-Vélez, the challenge of computational science is building systems that are not only powerful, but keep up with the changing needs of science, industry and society.
“Computing has become the invisible engine powering modern society,” says Prof Horacio González-Vélez.
“From climate modelling and medical research to financial analytics and digital communications, vast amounts of data must be processed, analysed and acted upon in real time.
“But none of this would be possible without the relentless evolution of high-performance computing (HPC) – the field dedicated to making computations faster, more efficient and scalable.
“My career has been shaped by this challenge: how do we build computing systems that are not just powerful, but also adaptable, accessible and sustainable?”
González-Vélez is professor of computer systems and head of the Cloud Competency Centre at the National College of Ireland (NCI), where he leads a multidisciplinary research group focused on HPC, cloud computing, advanced digital skills and large-scale data processing.
He completed undergraduate and postgraduate degrees in computer science in Mexico, Japan and the UK, and received his PhD in informatics from the University of Edinburgh. He also spent 10 years in the tech industry, including roles at Silicon Graphics and Sun Microsystems.
“This industry perspective has been invaluable in my academic leadership, where I have worked to bridge the gap between cutting-edge research and practical applications.”
González-Vélez says that his research has evolved over the years to keep “in step with the changing needs of science, industry and society”, all with the aim of “ensuring that the power of computation continues to solve real challenges, transform industries and prepare future generations for the digital age”.
Tell us about your current research.
The rapid transformation of HPC, cloud infrastructures and distributed systems is reshaping the digital landscape. The ability to efficiently manage large-scale computational workloads, optimise data-intensive applications and develop sustainable computing strategies is no longer confined to research laboratories or supercomputing centres. It is a crucial enabler of economic growth, workforce transformation and societal advancement. My research has evolved in response to these changes, focusing on how HPC principles can be applied to real-world challenges, ensuring that computing power is accessible, scalable and impactful.
At the core of my work is a commitment to advancing digital skills, enhancing computational efficiency and integrating sustainability into cloud and HPC architectures.
The increasing complexity of modern computing environments necessitates not just technical expertise but also an adaptive workforce, capable of leveraging parallel and distributed computing techniques in a way that is practical and efficient.
To this end, I coordinate Digital4Business, a €20m EU-funded initiative that is designing a highly scalable master’s programme in advanced digital skills. This programme is not just about theoretical knowledge; it provides hands-on experience in digital transformation, cloud architectures, data science and AI, ensuring that professionals and organisations can effectively harness computational solutions for business innovation and industrial applications.
However, computational power alone is not enough; it must be secure, reliable and sustainable. The rise of data-driven decision-making, cybersecurity threats and the need for digital sovereignty has placed new demands on how we manage and protect computational resources. Within Digital4Security, our group collaborates with researchers and industry partners to deliver educational programmes that prioritise security without compromising efficiency.
The sustainability dimension of computing is equally critical. While digital transformation promises enhanced productivity and efficiency, it also raises concerns about energy consumption, carbon footprints and the long-term environmental impact of large-scale computing infrastructures.
Our work in Digital4Sustainability tackles these challenges head on by developing skills for energy-efficient computing. By integrating green computing principles into digital transformation strategies, this research helps industries and governments balance computational power with environmental responsibility.
Beyond enterprise and industry applications, computational literacy and accessibility remain key challenges. It is not enough to build powerful computing infrastructures; we must also democratise access to HPC-driven technologies. This is where SMARCO and Code4Europe come into play.
These initiatives focus on empowering communities, businesses and young professionals with the skills needed to engage with advanced digital systems. Whether it is helping public administrations adopt scalable digital workflows or providing hands-on training in cloud computing and parallel processing, these projects are shaping the next generation of computationally skilled professionals.
Taken together, my research portfolio represents a holistic approach to high-performance computing in the modern era – one that integrates digital skills development, computational efficiency, security and sustainability into a unified vision. The future of data-intensive computing is not just about scaling performance; it is about ensuring that HPC remains a tool for innovation, resilience and responsible technological progress.
In your opinion, why is your research important?
The role of parallel computing and HPC in scientific discovery and industry transformation cannot be overstated. My work seeks to ensure that HPC remains a cornerstone of European and global digital strategies, reinforcing its impact in computational sciences, engineering, finance and public sector innovation.
I firmly believe that HPC and distributed computing will continue to shape the next era of digital transformation, providing robust and scalable solutions to challenges in sustainability, cybersecurity and data-driven governance.
What inspired you to become a researcher?
Throughout my career, I have had the privilege of interacting with and learning from some of the pioneers of computational sciences. Two individuals who have had a profound influence on my research path are Prof Jeffrey Ullman and Prof Jack Dongarra.
Jeffrey Ullman, a seminal figure in computational theory and database systems, has deeply influenced my approach to parallel computing models and scalable systems through his work on algorithms and data structures. Meeting him at NCI during the opening of the Cloud Competency Centre was a significant moment. His textbooks on compilers, algorithms, automata theory and databases were my faithful companions during my undergraduate studies.
Jack Dongarra, one of the most influential figures in high-performance computing, has been instrumental in shaping modern parallel computing techniques through his contributions to numerical linear algebra, benchmarking and supercomputing architectures. His pioneering work on the LINPACK benchmark and scalable parallel processing continues to inspire my own research, particularly in optimising cloud-based HPC infrastructures for real-world applications.
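For readers unfamiliar with LINPACK: at its core, the benchmark times how quickly a machine can solve a dense system of linear equations Ax = b and reports the result as floating-point operations per second. The sketch below is a deliberately simplified, single-core illustration of that idea in plain Python (the real benchmark uses highly optimised, parallel libraries); the matrix size and the `solve_dense` helper are illustrative choices, not part of the official suite.

```python
import random
import time

def solve_dense(a, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting,
    the core operation timed by the LINPACK benchmark."""
    n = len(a)
    # Build the augmented matrix [A | b].
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: bring the largest remaining entry to the diagonal.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back-substitution.
    x = [0.0] * n
    for i in reversed(range(n)):
        s = sum(m[i][c] * x[c] for c in range(i + 1, n))
        x[i] = (m[i][n] - s) / m[i][i]
    return x

n = 100
a = [[random.random() for _ in range(n)] for _ in range(n)]
x_true = [random.random() for _ in range(n)]
b = [sum(a[i][j] * x_true[j] for j in range(n)) for i in range(n)]

start = time.perf_counter()
x = solve_dense(a, b)
elapsed = time.perf_counter() - start

# Elimination costs roughly (2/3) * n^3 floating-point operations.
flops = (2 / 3) * n ** 3
print(f"{flops / elapsed / 1e6:.1f} MFLOP/s (toy, single core)")
```

The gap between this toy's throughput and the exaflop-scale figures of TOP500 machines is precisely what Dongarra's work on optimised, parallel linear algebra closes.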
Their insights have shaped my perspective on how theoretical advancements in computing must be coupled with real-world applicability, ensuring that HPC remains a key driver of computational efficiency and innovation.
What are some of the biggest challenges or misconceptions you face as a researcher in your field?
Despite its immense potential, HPC faces several misconceptions and challenges:
HPC is only for large supercomputing centres: While supercomputing facilities play a crucial role, HPC methodologies are increasingly being applied to cloud and edge computing, making them accessible to a broader range of applications.
HPC and cloud computing are separate domains: A common misconception is that cloud computing and HPC serve different purposes, whereas in reality, modern HPC workflows increasingly rely on cloud-based architectures.
HPC is only relevant to scientific research: HPC is fundamental to industries beyond science, including finance, healthcare, energy and cybersecurity, where high-speed computing enables real-time analytics and decision-making.
The complexity of HPC is a barrier to wider adoption: While parallel computing and distributed architectures require technical expertise, advancements in user-friendly platforms and education initiatives are making HPC more accessible than ever before.
Do you think public engagement with science and data has changed in recent years?
The past decade has seen an increased appreciation for the role of HPC in everyday applications. The Covid-19 pandemic, in particular, highlighted the need for computational models, large-scale simulations and data-driven decision-making.
This shift presents a unique opportunity for researchers to engage more proactively with policymakers, educators and industry leaders, ensuring that HPC continues to drive technological and societal advancements.
Personally, I always introduce myself as a computational scientist with a marketing twist, as I truly believe it is important to engage with the general public and disseminate findings.