Connectivity, software, data: modern sensors and optronics are highly digitalized systems. Artificial intelligence (AI) plays an increasingly important role in their performance – just as it does for weapons systems as a whole. It is a development that society and politics follow closely, and one that raises many questions. Celia Pelaz, Chief Strategy Officer and member of the HENSOLDT Management Board, and Yvonne Hofstetter, co-founder and Chief Executive Officer of “21strategies”, a company specializing in particularly demanding AI applications, are well placed to answer them. A conversation about the potential and limits of artificial intelligence, Europe’s digital sovereignty, and collaborative innovation in the defence sector.
Ms. Hofstetter, you develop AI for investment funds, for hedging exchange rate and commodity risks – and for military applications: How did this unusual portfolio evolve?
Yvonne Hofstetter: In all the cases mentioned, humans are exposed to a complex, dynamic environment, even if decisions on the financial and commodity markets cannot be compared in their scope to those in defence. In real time, humans are challenged to make trade-offs under uncertainty, based on ambiguous information in a highly volatile environment. Next-generation AI gives them more confidence in doing so. For defence systems, this raises serious societal, legal, and ethical issues. Defence is where our roots lie, and it is where we have returned by founding 21strategies. After all, we developed our early AI technologies in the military research labs of the late 1990s.
Ms. Pelaz, to what extent is AI already being used at HENSOLDT today?
Celia Pelaz: We have been working for many years with technologies that are today grouped under this term – even at a time when HENSOLDT was not yet an independent company. With AI, we can no longer differentiate ourselves from the competition solely by how well a sensor solution perceives a situation. Today, we increasingly differentiate ourselves by how intelligently the sensor processes, interprets, and analyzes what it perceives – and, in doing so, how it builds information from data.
And what does that mean in concrete terms?
Celia Pelaz: AI first helps a radar or optronics device to increase detection performance, for example when image stability leaves something to be desired. Then AI assists in the task of correctly classifying objects – for instance, as a bird or as a drone – and tracking them accordingly. At the next level, object data and contextual knowledge are combined via AI in such a way that tactically relevant information is generated from them, such as whether the object is an enemy platform. Here, AI is also important in what is known as multisensor data fusion – combining and analyzing data from an increasing number of sensor sources distributed across networked platforms. This increasingly includes publicly available data from the Internet, so-called open-source intelligence. All of this results in a comprehensive, consistent, and up-to-date situation picture. AI thus takes pressure off the soldier, supporting them with options for action so that they can make the right decisions. And AI enables systems such as radars or jammers to learn for themselves and adapt to unknown situations. This is what we call cognitive systems.
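To make the fusion step described here slightly more concrete, the following minimal sketch shows one simple way classification estimates from several sensors could be combined for a single tracked object. It assumes a naive independence model; the sensor names, class labels, and probabilities are hypothetical and this is not HENSOLDT’s actual algorithm.

```python
# Illustrative sketch only: a toy multisensor classification fusion step.
# All sensor names, classes, and values are hypothetical.
from collections import defaultdict


def fuse_classifications(sensor_reports):
    """Combine per-sensor class probabilities for one tracked object.

    sensor_reports: list of dicts mapping class label -> probability,
    one dict per sensor that observed the object.
    Returns a normalized fused distribution (naive independence assumption).
    """
    fused = defaultdict(lambda: 1.0)
    labels = set()
    for report in sensor_reports:
        labels.update(report)
    for report in sensor_reports:
        for label in labels:
            # A small floor keeps one silent sensor from zeroing out a hypothesis.
            fused[label] *= report.get(label, 0.05)
    total = sum(fused.values())
    return {label: p / total for label, p in fused.items()}


if __name__ == "__main__":
    radar_report = {"drone": 0.7, "bird": 0.3}
    optronics_report = {"drone": 0.8, "bird": 0.2}
    print(fuse_classifications([radar_report, optronics_report]))
    # -> the "drone" hypothesis dominates after fusion
```

A real fusion engine would additionally weigh sensor reliability, temporal alignment, and track association, all of which this toy example omits.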
Let’s look at the defence industry as a whole. What role will AI play in tomorrow’s security?
Celia Pelaz: A very central one. The ability to extract relevant information from data is increasingly determining who is superior or inferior in a conflict. It is about finding a tiny needle in a huge haystack: modern defence applications produce so much data that we very quickly reach the point of human overload. In the public debate, we often discuss AI as a potential source of error. In reality, the error rate of AI, especially in routine tasks, is lower than that of humans. AI is becoming ever more capable and is therefore becoming relevant at more and more levels of weapon systems. At HENSOLDT, we recently bundled our competencies in a central AI hub to integrate expertise from the various domains and projects more closely. In this way, we are also facilitating collaboration with partners, because AI is a driver for more cooperation: there are many highly interesting players, some with extremely specialized know-how. So the potential that AI holds for our industry is enormous. All the more reason for me to call for a realistic, responsible AI debate. Some of what is being touted in the market strikes me as hype rather than serious innovation.
In what way?
Celia Pelaz: Giving the impression that AI is the solution to all problems simply misses the reality. Even the momentum we see in deep learning and neural networks does not relieve us of the task of developing and funding other, innovative core technologies for the defence of security and freedom. Realistically, AI is an important lever for achieving higher levels of performance for defence systems. But the technological foundation – like the one we are laying with leading sensor and optronic solutions – remains essential.
Yvonne Hofstetter: I can only underline that. AI is nothing more than a toolbox of mathematical theories and information techniques. And above all, there is no single artificial intelligence: depending on the problem you are facing, you have to select certain techniques from this toolbox and use them for the specific purpose. Machine learning, for example, is the best solution only in certain cases. For many problems, especially those where nothing has to be estimated at all, direct calculations are much better suited, much more accurate, and, above all, comprehensible.
Celia Pelaz: And that is precisely why, as a defence solutions provider, we must also master the full range of AI, either through our own capabilities or partnerships. The basis is always our application know‑how – the deep knowledge of our customers’ mission requirements and doctrines.
HENSOLDT and “21strategies” are jointly researching third-wave AI. What is that?
Yvonne Hofstetter: The third wave of AI revolves around training the tactical behavior of machines, among other things. Instead of simply processing mass data, such as that generated by radars, a machine selects a decision from a large number of possible options in order to achieve a specific goal. Such machines are already known from somewhere else – from the gaming community, where intelligent machines have the task of defeating humans in games like StarCraft. Real battlefields are far more complex than even the most complicated game. In our joint “GhostPlay” project, we are investigating which tactics intelligent machines develop on the battlefield and what human soldiers can learn from them. To do this, we model a digital twin of the battlefield and of the available sensors and effectors with their physical properties. We then feed these models into the tech stack and pit AI against AI. One example is so-called SEAD missions (suppression of enemy air defences), whose goal is to take out air defence systems. Here we observe, on the one hand, how AI manages the tactical interplay of the individual components of the air defence system and, on the other, how it guides the sensor-effector network of the attacking swarm of unmanned systems.
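As a rough illustration of this “AI against AI” setup, the sketch below shows the bare skeleton of a simulation loop over a simplified digital twin: two agents choose tactical options, the twin advances, and the outcomes are logged. All classes, option names, and policies are hypothetical placeholders and are not taken from the GhostPlay project.

```python
# Illustrative sketch only: skeleton of an "AI against AI" loop over a toy digital twin.
# Every name here is a hypothetical placeholder.
import random


class TwinEnvironment:
    """Toy stand-in for a digital twin of the battlefield."""

    def __init__(self, steps=10):
        self.steps = steps
        self.t = 0

    def observe(self):
        # A real twin would expose modeled sensor pictures here.
        return {"t": self.t, "noise": random.random()}

    def apply(self, blue_action, red_action):
        # A real twin would update physical models of sensors and effectors here.
        self.t += 1
        return {"blue": blue_action, "red": red_action}

    def done(self):
        return self.t >= self.steps


class TacticalAgent:
    """Placeholder for an agent choosing among tactical options."""

    def __init__(self, name, options):
        self.name = name
        self.options = options

    def act(self, observation):
        # A trained agent would pick the option maximizing expected mission value.
        return random.choice(self.options)


def run_episode():
    env = TwinEnvironment()
    blue = TacticalAgent("blue_swarm", ["advance", "jam", "strike"])
    red = TacticalAgent("red_air_defence", ["search", "track", "engage"])
    while not env.done():
        obs = env.observe()
        outcome = env.apply(blue.act(obs), red.act(obs))
        print(env.t, outcome)


if __name__ == "__main__":
    run_episode()
```

A realistic setup would replace the random policies with trained decision-making models and the toy environment with physics-based models of sensors and effectors.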
Celia Pelaz: For HENSOLDT, early and open-ended engagement with such AI-based decision-making processes is a logical consequence. After all, we have long since moved beyond developing the five senses and are increasingly developing the central nervous system of defence applications. With edge computing, data-driven intelligence is moving even closer to sensors and is already often embedded in them. Using integrated systems to perceive external impressions, process them, and convert them into reactions is our core business today.
“21strategies” was founded in 2020 in Freising near Munich by Professor Yvonne Hofstetter together with the mathematician and AI expert Dr. Christian Brandlhuber and the information theorist and philosopher Dr. Scott Muller. The company transforms how organizations make tactical and strategic decisions under uncertainty. To this end, it delivers third-wave AI, the future gold standard for tactical versatility with artificial intelligence.
“21strategies” and HENSOLDT are collaborating on the “GhostPlay” project to research the next generation of artificial intelligence for defence applications. The research project, commissioned by Helmut Schmidt University, is funded by the Zentrum für Digitalisierungs- und Technologieforschung der Bundeswehr (DTEC.Bw). In addition, HENSOLDT has entered into a strategic cooperation with “21strategies” and holds a minority stake in the AI company.
The topic of autonomy is precisely what triggers great reservations about AI among the general public.
Celia Pelaz: This part of the discussion, too, must be conducted in a factual and nuanced manner. The “killer robots” so often cited in arguments against AI have little to do with reality. The first question to ask here is basically banal: are we really talking about autonomous systems, or rather about systems remotely controlled by humans? Or about AI that primarily processes, fuses, and analyzes data? If we are dealing with autonomy in the narrower sense, the crucial question is what we call the “system of interest.” What is the intention behind the action? How does the system that I want to enhance with AI work? In which situation and in which context? Based on this, we should define what AI may and may not do. In critical applications, the principle of “human-in-the-loop” must always apply. In other words, humans remain responsible and make the final decision; AI supports them.
Yvonne Hofstetter: Just as an action is not ethical per se simply because humans are involved, technology is not unethical per se. Values can certainly be integrated into technology. The only question is how, and which values we prioritize and translate into system functions. “GhostPlay” takes an important step here. It is the first AI system in Germany designed for military users that follows the new standard for value-based technology.¹
What does this standard do in concrete terms?
Yvonne Hofstetter: It describes process steps and test criteria for the development of value-driven technology – without itself prescribing a specific set of values. Many legal and ethical demands have already been made of AI, for example by NATO or the EU. But how do I get from these imprecise claims to a technology that actually implements values? This is where the IEEE 7000™-2021 standard comes in, introduced at the end of 2021 by the Institute of Electrical and Electronics Engineers (IEEE). It is the first standard that addresses technology yet demands ethics. For the first time, engineers are called upon to follow a standard in order to translate values into technology. For this purpose, IEEE 7000™-2021 even introduces a new profession, the so-called “Value Leads.” They are trained in ethics and have to channel a “system of interest” through the standardized process, measure it against ethical criteria, and ensure appropriate technical precautions. ISO standardization has already followed suit. That’s another reason why I think it’s fundamentally important for tech companies to build up expertise in this area.
Celia Pelaz: I see this as a great opportunity for us Europeans in particular. In the energy sector, we are currently experiencing very painfully what it means to become geopolitically dependent. We must do everything we can to ensure that we do not experience a similar development in the tech sector. A central key to this is social acceptance. And we can only achieve this if we can transparently explain how we anchor our moral compass in our technologies. From a purely technological point of view, we in Europe are in a really good position in global competition in many areas, including AI. But we often encounter social reservations that inhibit innovation and lead to technologies being regulated before they have even been developed. Ultimately, the question is whether we will succeed in establishing a sovereign digital infrastructure in Europe.
What else is needed for a digitally sovereign Europe?
Yvonne Hofstetter: First of all, the political will. In this millennium, Europe has deliberately dispensed with digital sovereignty and made itself comfortable as a free rider of Silicon Valley. We let the Americans do their thing, imported their values along with the algorithms, and failed to build up or maintain our own capabilities in many areas. In search engine algorithms or cloud infrastructures, for example, I now consider it unrealistic that Europe will catch up in the foreseeable future. We need to find our niche in other areas. In my view, we in Europe are strong in the development of concepts, for example – that is, in complex digital solutions specifically designed to meet the particular requirements and needs of a specific sector or area of application, such as defence. In this respect, our approach is also further ahead than that of many IT companies from Silicon Valley, which rely solely on knowledge gained from data and believe that experts are no longer needed.
Celia Pelaz: We must confidently and resolutely define which capabilities we in Europe absolutely want and need to master ourselves. That is the central first step. Perhaps we as Europe cannot achieve digital sovereignty everywhere, but in core areas we cannot afford to be dependent on others. With a black box, we don’t know exactly what we are buying, or whether we will even be able to get it tomorrow. This is precisely why we need to develop AI for our defence in Europe ourselves, for example. It is a key technology for Europe’s digital sovereignty, and we must ensure that it is in line with our values and thus finds social acceptance. This also requires implementation strength – otherwise concepts will remain just concepts in the end. We need a strong innovation ecosystem, especially in the defence sector.
What are the challenges for an innovative defence sector? How could such an innovation ecosystem of established players and startups develop better?
Yvonne Hofstetter: This is partly, though not only, a question of money. In recent years, the German defence sector has suffered from image problems. Money has not been invested in building up the capabilities of the Bundeswehr but has been handed out to society as a peace dividend. And the smaller the defence budget, the more tightly the few established industry players can shield that budget from direct competition. Today, we are confronted with a largely closed sector in which new players are hardly able to gain a foothold. As a result, research institutes, many of which are doing excellent work, have little incentive to spin off companies. Innovations thus remain stuck in the institutes as studies and never become products, which means that some research funding is not invested effectively.
Celia Pelaz: In addition to financial resources, we can also achieve a great deal by changing processes, especially in procurement. Significantly faster and simpler procurement cycles would not only ensure that innovations reach customers more quickly. They would also prevent startups from having to hold out for years until their work pays off economically – and from possibly being crushed by bureaucracy before it does. In our industry, the public sector will always be the most important customer. For young companies, investments in disruptive technologies will therefore always remain a gamble as long as it is uncertain whether the public sector will ever commission them. Public-private partnerships are the right way to provide planning security and incentives for innovation. Without NASA, SpaceX would not even exist! We can learn a lot from such examples when we look beyond our own backyard.
Where do you see role models?
Yvonne Hofstetter: When it comes to building bridges to the startup world, I find the concept of DARPA, an agency of the U.S. Department of Defense, very interesting. It awards contracts solely on the basis of technological innovation, regardless of who is behind it. If something is technologically groundbreaking, a contract can go to a one-person company. This has worked very successfully for decades.
Celia Pelaz: With the DIANA Accelerator, NATO recently sent out the right signal, and the same applies at another level to the Bundeswehr’s “Cyber Innovation Hub”. The example of Israel shows just how innovative close cooperation between the military, society, and the defence industry can be. There, every industry representative, whether from a startup or a large corporation, served in the military in their youth. They know and understand each other, and together they are countering a threat to which the population does not turn a blind eye. In the US, we see that innovation also thrives on being able to tap into a huge market through a single point of contact. In Europe today, the opposite is often still the case. We have to change that. More European cooperation in politics and industry means more technological progress and more security!
¹ ISO/IEC/IEEE 24748-7000:2022
Celia Pelaz is responsible for HENSOLDT’s business development and strategy as Chief Strategy Officer. The industrial engineer has been a member of the HENSOLDT AG Management Board since 2021 and is also responsible for HENSOLDT Ventures, the HENSOLDT Group’s own tech incubator, as well as the Spectrum Dominance and Airborne Solutions division, which she headed directly until the fall of 2022. Previously, Celia Pelaz was Head of Strategic Business Development at HENSOLDT, having been appointed Head of Transformation and Corporate Functions at Airbus Defence and Space Electronics in July 2014. During her 14-year tenure at Airbus, Celia Pelaz also worked in Brazil, where she played a key role in building the company’s presence in the market as Programme and Campaign Manager for Public Security Programmes.
As an entrepreneur, lecturer, and essayist, Professor Yvonne Hofstetter has been concerned for many years with advancing digitalization, its potential, and the associated social upheavals. The trained lawyer has founded and successfully developed several IT companies, including Teramark Technologies for software development and the AI startup “21strategies”. At the intersection of IT, ethics, politics, and law, Yvonne Hofstetter addresses the implications for liberal societies and self-determined individuals when scopes for action are increasingly shaped by algorithms and AI. In 2020, she was appointed Honorary Professor of Digitalization and Society by Bonn-Rhein-Sieg University of Applied Sciences, where, as a pioneer of value-based engineering, she works on its implications specifically for the Bundeswehr, among other things. Yvonne Hofstetter is also a member of the Data Protection Advisory Board of Deutsche Telekom AG, a member of the Scientific Advisory Board at the Institute for Digital Ethics at Stuttgart Media University, and a member of the Commission for Democracy and Technology at the British think tank Chatham House in London. The author of three books was awarded the Theodor Heuss Prize in 2018.