
The IPU Science for Peace Schools
Chapter 1: Applied Science
Topic 1 – Planning for the large scientific challenges
Speaker: Dr. Markus Nordberg (Head of Resources Development, Development and Innovation Unit, CERN)
Scientific Experiments
Dr. M. Nordberg (CERN) said that scientific experiments were typically planned and executed under very specific circumstances. There were four key criteria that needed to be met.
First, new experiments could take place when the main goals of any ongoing experiments had been achieved or when their discovery potential was in decline. It was normal for the rate of new knowledge and discoveries to slow down over the life cycle of any experiment. At some point, further work or effort might not achieve meaningful results or deliver the desired return on investment.
Second, it was crucial that scientists had the ability to dream and contemplate new ideas. Dreaming was the glue that held all scientific research and development together.
Third, there needed to be sufficient opportunity, not just in the scientific domain, but also in the wider geopolitical context. There was a continuous dialogue between science and politics, which the Science for Peace Schools would help to unlock.
Fourth, a tolerance for research activities was vital. In the context of experiments, tolerance referred to people working together because they shared the same passion for science. It was important that people were prepared to work together, even if they did not necessarily share the same views or perhaps even like one another. There was a richness in disagreeing with colleagues. It was often assumed that good collaborative relationships consisted of people who regularly shared the same views. Such an assessment was fundamentally incorrect. Success could not occur if everyone agreed. To make leaps of orders of magnitude where there was no clear path to results, diversity was a cornerstone of collaboration, and it was crucial to accept the passion and purpose of everyone involved.
Scientists cannot predict the future
To plan experiments, the shared dream or purpose needed to be articulated in ways that could be measured against the goals and time frames for achievement. It was a common perception outside the scientific community that success in science referred exclusively to binary yes/no questions or finite truths. However, such perceptions were inaccurate. Results in science were more often measured on a spectrum or scale. It was important that everyone agreed on the same measurement or articulation methods. Physics had different expressions of goals, which usually revolved around questions such as why something existed or why something occurred in a particular way. For more specific areas, the line of questioning shifted towards whether something in fact existed. In the case of the Higgs boson elementary particle, questions had been asked about its existence some 60 years previously; it was ultimately discovered in 2012. When articulating and agreeing on a shared purpose, it was important to avoid being too specific, with sufficient room allocated for serendipity or ambiguity. Both concepts were exceptionally powerful if carefully nurtured. Scientists could not predict the future; it was crucial for them to be comfortable with the prospect of being uncertain. Serendipity should not be understood as not knowing what was happening, but rather as an inability to predict a specific outcome ahead of time.
Collaborations among scientists are formed organically
Collaborations among scientists typically formed organically, sometimes even starting from conversations over lunch in the CERN cafeteria, and took on a bottom-up approach. The best collaborations started from a group of passionate individuals contributing to a dream that focused on a specific goal. He encouraged participants in the Science for Peace Schools programme to talk to the scientists in the cafeteria and strike up conversations so as to hear about their passion, dedication and excitement for research. A useful tool used to formalize collaborations among partners was a memorandum of understanding, which laid out the initial capital investment, the basis for project structures and the overall relationship. Memorandums of understanding also encapsulated the nature of expected deliverables, the recognition of contributions and how resources were used and reported. The concept of deliverables was particularly important, as it allowed each participating institution to agree on its individual measurable contribution. The relationship with the host lab, as well as its involvement and support, was especially significant in terms of administrative guidance and reporting mechanisms. It was crucial to agree on procedures for solving challenges, both technical and administrative, for disseminating information within the collaboration, and for engaging with different parties when needed. Foresight was needed to plan for when issues arose and how they would be addressed. Most often, memorandums of understanding took many years, if not decades, to build and implement.
Democratic scientific decision-making through equal representation of ideas
To ensure experiments took place effectively, it was important to provide for consensus, ensuring that all parties felt comfortable before proceeding. Everyone was given the opportunity to influence decisions through equal representation or voting mechanisms. Transparency was a key principle, along with a collective understanding that experiments would not work if the contribution of each and every party was not delivered properly. There was a generous approach to sharing in success. Everyone who participated in an experiment, no matter how small their contribution, would have their name printed in the resulting scientific paper. In many cases, such an approach resulted in thousands of names appearing in a single scientific paper. Understandably, rigorous processes were followed internally to ensure that what had been discovered was correct and that everyone could agree on verifiable data. Such procedures and committees ensured that there was careful monitoring, and that funding agencies could understand how their financial contributions were being spent.
Rules for experiments
There were five simple rules for experiments: 1) allow people to dream; 2) accept diversity; 3) utilize a bottom-up approach with leaders elected based on technical competence, credibility and trust; 4) allow for a spirit of collaboration and friendly competition; and 5) question and justify all aspects using experimental proof to find agreement.
The experience at CERN had demonstrated that there were lessons for the future. Ultimately, collaborations and large experiments could take place, but the quality of new ideas, the availability of technologies and resources, and the geopolitical situation were fundamental. It would be advisable to pay greater attention to addressing the effects of economic uncertainty and ensure that contingency plans were in place. The CERN model could certainly be replicated. The best contexts in this regard would be open science or open innovation-driven relationships involving complex, large-scale undertakings where the risk-benefit ratio was high, where a collaborative approach was required, and when it was impossible to set out all the steps in advance.
Topic 2 – How CERN technology makes its way into society
Speaker: Dr. Han Hubert Dols (Head of Business Development & Entrepreneurship at CERN)
CERN as incubator of ideas and accelerator of inventions
Dr. H.H. Dols (CERN) said that it was a common misconception that all staff members at CERN had a background in physics. In fact, a wide cross section of technological and engineering fields was represented, including welding, electronics, programming, robotics and automation. The work performed at CERN had inspired the development of numerous disruptive and revolutionary technologies that had made their way into society, such as the World Wide Web, the tracker ball and the touchscreen. CERN was therefore working to accelerate such innovations and developments, in the belief that its technologies could best support development, encourage innovation and address many of society’s present-day challenges.
Collaborations often occurred by coincidence or serendipity. To facilitate strategic connections, encourage engagement and ensure a greater frequency of collaborations, CERN had taken numerous actions within its network. First, many conferences and events were proactively organized with industrial players where it was believed that CERN technologies could be of substantial benefit. Second, intellectual property was carefully managed to establish clear ownership. Third, almost all software used within the Organization was managed on an open-source basis. Lastly, support was provided to personnel who wished to start up their own businesses.
However, reconciling CERN’s technology capabilities and expertise with commercial markets posed certain difficulties. The Organization did not possess the necessary business intelligence and knowledge. CERN did not function in the same way as a typical business, which would usually start with the needs of a particular market or with how to address a specific need or problem. Similarly, promoting and pushing innovative technologies onto companies might not prove particularly effective in encouraging uptake, as there might not be a business case for certain companies. A hybrid approach that reconciled technology push with market pull proved to be most effective. In coordination with its Member States, the hybrid approach focused on five strategic areas for proactive action: health care, the environment, digital, aerospace and quantum.
Understanding the potential unaddressed needs across different industries
At the lowest level, there was an active examination of new technologies and know-how among the Organization’s technical departments and experts to understand the uniqueness of CERN’s capabilities. In parallel, attendance at industry events and conferences provided a basis for understanding the potential unaddressed needs across different industries and the typical problems faced by society at large that could be solved. At the next level, technical and intellectual property dossiers were developed by the technical departments and experts, which were then translated into a clear value proposition for companies, so that they could understand how the technologies specifically related to their business needs. At the top level, it was essential to mobilize technical experts so they could contribute meaningfully to projects that had a strong focus on, or link to, general sustainability goals. To identify such projects and which companies would be open to such contributions, CERN liaised with representatives from its Member States. The profile of an innovation partner was typically a start-up or a company that was open to new technologies and know-how, with an innovation horizon of between five and ten years.
Once a company had been identified, there were typically three key stages to successfully shape and nurture innovation partnerships. The first stage involved understanding a company’s needs. Discussions on innovation with technical or research and development teams generally took place with a view to understanding the company’s ambition and identifying the gaps in its expertise. Occasionally, several director-level individuals from the company might be invited to a discovery day programme at CERN, which would include a set agenda of topics to discuss or visits to departments or labs that would be of potential interest. It was hoped that such actions would lead to interest from both parties.
After establishing potential areas of action, the second stage involved discussing certain specific commitments, such as resources, compensation and intellectual property, which typically involved collaborative research and development activities.
Partnership with CERN
Lastly, the partnership needed to be formalized. There were numerous formats that could be implemented to collaborate with industry and institutional partners. First, a licence agreement, which would allow a specific technology invented and developed by CERN to be used by a company in a specific field. Second, a consultancy-type arrangement, which could help to address certain problems, provide new insights and tackle certain challenges. Third, CERN could leverage its technologies and facilities to perform research under contract. Fourth, collaborative research and development, which allowed both parties to investigate a general issue and jointly work to find a solution. Generally, it took many years for such partnerships between industry and institutional partners to develop.
One such partnership had been established with MedAustron, Austria, and the National Center for Oncological Hadrontherapy, Italy, which provided hadrontherapy to treat patients with specific types of cancer. Such treatment was particularly successful against deep tumours, against which traditional radiation therapy had proven to be ineffective. Aspects of CERN’s technologies were incorporated into the equipment used to provide the therapy. Both facilities used a particle accelerator, housed in a room adjacent to the treatment room, to direct a beam of protons onto the tumours.
Another partnership was with the Bundesdruckerei Group, Germany, in relation to the development of material sciences and technologies in the fields of identity management, cryptography and data handling, especially passport and banknote production.
CERN had also developed machine-learning algorithms that processed data quickly and generated accurate calculations, usually in a matter of nanoseconds, using specialized software and hardware. Zenseact (a subsidiary of Volvo Cars) had entered into a collaboration with CERN to leverage such technologies with a view to ensuring that motor vehicles could never hit pedestrians.
In regard to data science, CERN was collaborating with Wageningen University & Research and the Commodity Risk Management Expertise Center to develop new methods to protect the commodities and financial markets from fraud and spoofing, by identifying anomalies that were particularly difficult for regulators to detect. Combining technologies from different sciences had yielded new insights that were producing tangible results.
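By way of illustration only, the sketch below shows the general shape of such unsupervised anomaly detection: hypothetical per-trader features (order-to-trade ratio and cancellation rate) are screened with scikit-learn’s IsolationForest. The features, figures and threshold are assumptions made for the example, not details of the actual collaboration.

```python
# Illustrative sketch only: unsupervised flagging of unusual trading behaviour.
# The features (order-to-trade ratio, cancellation rate) and all numbers are
# hypothetical and are not taken from the CERN/Wageningen collaboration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic per-trader features: [order-to-trade ratio, cancellation rate].
typical = rng.normal(loc=[5.0, 0.30], scale=[1.0, 0.05], size=(500, 2))
spoof_like = rng.normal(loc=[40.0, 0.95], scale=[5.0, 0.02], size=(5, 2))
features = np.vstack([typical, spoof_like])

# Fit an Isolation Forest and mark the most isolated points for human review.
model = IsolationForest(contamination=0.01, random_state=0).fit(features)
flags = model.predict(features)  # -1 marks an anomaly, 1 marks normal behaviour

print(f"{(flags == -1).sum()} traders flagged for closer regulatory review")
```

In practice such a detector would only shortlist accounts for human investigators; the point of the technique is to surface patterns that rule-based checks miss.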
CERN was also working with the European Space Agency, on a contract-research basis, on satellite technology to study the effects of radiation on electronics.
One of the participants asked if CERN had purchased any technologies from external parties or if all technologies used within the Organization were developed exclusively on an in-house basis.
Dr. H.H. Dols (CERN) said that CERN spent significant amounts of money on procuring technologies from different parties. It was interesting to note that it was sometimes challenging for external companies to meet CERN’s demanding requirements and expectations regarding precision and accuracy. However, help and support were provided in that regard, which allowed such companies to become better competitors in their markets. It was also true that many companies performed better in certain areas than CERN, which was why procurement was necessary.
One of the participants asked if there was a difference between the technology transfer that occurred within universities and the technology transfer at CERN.
Dr. H.H. Dols (CERN) said that the research carried out at universities was often more applied and more directly related to the development of a specific product. In many cases, universities received funding from industry. CERN’s technology, on the other hand, was more niche and focused on relatively extreme areas of technology research, which did not typically have applications for everyday products. Universities were also more willing to take a direct stake or investment in start-ups.
A new initiative entitled the CERN Innovation Programme on Environmental Applications had been established with a view to leveraging the Organization’s technologies to benefit the environment. There were four key areas in this regard: renewable and low-carbon energy, sustainability and green science, clean transportation and future mobility, and climate change and pollution control. One such example was with Tokamak Energy, in the United Kingdom, where a consultancy arrangement had been implemented to provide expertise and capacity in the simulation of currents and magnetic fields in fusion power. CERN was also collaborating with ABB in developing a digital representation of CERN’s entire cooling and ventilation system with a view to improving efficiency and reducing energy consumption. It was interesting to note that electric motors represented approximately 30% of all energy consumption in big industry. By applying sensors to motors that detected vibrations, it was possible to identify how a motor was performing and how it could subsequently be made more efficient. An agreement had been reached with ABB to publish and share the related data among other big industry players so that they could similarly benefit, and as a result reduce their electricity consumption. It had recently been announced that CERN would collaborate with Airbus to assess superconducting technologies for future zero-emission aeroplanes, with a view to potentially using hydrogen on board.
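As a rough illustration of the vibration-monitoring idea, the sketch below compares the spectral energy of a motor’s vibration signal around a fault-related frequency against a healthy baseline. The sampling rate, frequencies and alert threshold are all assumptions made for the example; they do not describe ABB’s or CERN’s actual system.

```python
# Illustrative sketch only: condition monitoring of an electric motor from
# vibration data. All parameters (sample rate, rotation frequency, threshold)
# are hypothetical assumptions, not values from the CERN/ABB project.
import numpy as np

FS = 1_000          # accelerometer sample rate (Hz), assumed
ROTATION_HZ = 50    # nominal motor rotation frequency (Hz), assumed

def band_energy(signal: np.ndarray, centre_hz: float, width_hz: float = 2.0) -> float:
    """Spectral energy of the signal within a narrow band around centre_hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
    return float(spectrum[np.abs(freqs - centre_hz) <= width_hz].sum())

t = np.arange(0, 2, 1 / FS)  # two seconds of signal
noise = 0.1 * np.random.default_rng(seed=1).normal(size=t.size)
healthy = np.sin(2 * np.pi * ROTATION_HZ * t) + noise
# A developing fault often shows up as a growing harmonic of the rotation frequency.
worn = healthy + 0.8 * np.sin(2 * np.pi * 2 * ROTATION_HZ * t)

baseline = band_energy(healthy, 2 * ROTATION_HZ)
current = band_energy(worn, 2 * ROTATION_HZ)
if current > 5 * baseline:  # arbitrary alert threshold
    print("Vibration signature has changed: schedule an inspection")
```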
The Organization had also helped to facilitate the work of start-ups, both in regard to existing companies and employees who wished to set up their own company, through the CERN Venture Connect programme. CERN was not an incubator or a venture capitalist, but rather brought together a network of venture capitalists and incubators to accelerate development. There were many different start-ups that utilized CERN’s technologies, such as: InsightART, which used Medipix chips to analyse layers in paint to authenticate paintings; TIND, which was used to digitize archives, texts and library systems; and MARS Bio Imaging, which produced colour X-ray imaging.
Working with CERN and innovating together brought numerous benefits to society, but there were some key lessons that had been learned from previous collaborations. CERN was particularly strong in its research at the extreme ends of the technology scale, which could help solve societal challenges. It was important to have passionate experts on both sides, to understand business problems and products, and to understand how to use certain technologies more comprehensively. Every collaboration needed to start with a concrete project and a clear business need. Companies usually operated with shorter time frames and business cycles, which were not necessarily reflected at CERN. Ultimately, courage was an essential trait, as it was generally unclear which projects would succeed and potentially become a disruptive force in society.
Topic 3 – Leveraging anticipatory science diplomacy for the SDGs
Speaker: Ms. Marieke Hood (Executive Director Impact Translator at GESDA)
What is GESDA?
Ms. M. Hood (Executive Director Impact Translator, Geneva Science and Diplomacy Anticipator (GESDA)) said that the aim of the present workshop was to present several approaches and perspectives in regard to anticipatory science and diplomacy. She would welcome feedback and questions to ensure the approaches put forward could adequately serve the goals of parliaments around the world. The IPU and GESDA had worked together for many years in shaping the vision of science and diplomacy, and crucially their combined interaction with both international organizations and parliaments. The workshop would provide an opportunity to showcase the partnership between the two organizations.
GESDA was an independent, not-for-profit organization that was established in September 2019 by the Government of Switzerland. The purpose was to develop a new diplomatic instrument that would provide greater and more effective anticipation of future scientific breakthroughs. GESDA had several key objectives. First, to leverage and optimize the benefits and opportunities of future technological developments. Second, to lay the groundwork for policymaking and multilateral governance in preparation for the implementation of such technological advancements. Third, to widen the circle of beneficiaries of advances in science and technology, which was rooted in the universal human right to benefit from the opportunities presented by science.
The methodology of GESDA
GESDA’s methodology focused on a series of structured actions under a multi-stage umbrella approach entitled the Anticipatory Situation Room. The first step was to anticipate science through cutting-edge research in collaboration with scientists around the world. Scientific breakthroughs and discoveries were collated into an annual report and online platform entitled the GESDA Science Breakthrough Radar, which was a single point of entry to review such emerging topics. The aim was for the Breakthrough Radar to become as comprehensive as possible. A group of approximately 1,000 scientists mapped and categorized scientific developments into four separate domains: quantum revolution and advanced artificial intelligence, human augmentation, eco-regeneration and geoengineering, and science and diplomacy. Each domain also comprised multiple subdomains. Scientists were asked for their expectations and the associated positive developments and negative repercussions over a horizon of 5, 10 and 25 years. As an example, significant developments were expected in the area of brain monitoring and neurodegenerative conditions. Positive developments in this regard could help patients to manage various conditions, but there were risks in relation to the potential control that could be exercised over a person’s brain and any undermining of human rights. Ensuring a balance between leveraging opportunities and mitigating risks was essential.
The second part of the methodology was to raise awareness of scientific developments among the diplomatic community through GESDA’s diplomatic forum. The aim of the forum was to foster discussions and reflections on what actions and which domains should benefit from the attention and focus of the multilateral governance system, so as to shape applications for the benefit of humanity and adequately prepare for the future. Another way to raise awareness and activate conversations among the diplomatic community was the GESDA Summit, which was held every year in October. The aim of the Summit was to foster conversation, with a view to establishing collaborations on the governance of scientific breakthroughs in the future, and crucially to begin work on creating and designing initiatives together.
Mr. D. Naughten (Chair, IPU Working Group on Science and Technology) said that one of the criticisms aimed at politicians was a perceived delay in their responses to technological developments and innovations. He asked how the information generated through GESDA could be applied in practical terms and what action could be taken in the immediate future.
Dr. G. Alabaster (Head of Geneva Office, UN-Habitat) said that there was a significant gulf between pure science and diplomacy. The role of an organization such as UN-Habitat was to distil highly complex scientific discoveries and innovations to a level that could be applied practically in the design of new programmes and approaches. Such support would enable governments to work out how best to embrace the best available solutions and bring them closer to their existing governance structures and systems. Unsurprisingly, there were challenges that needed to be understood and addressed, notably because approaches to governance varied around the world. UN-Habitat provided such support and considered multiple perspectives in its discussions with partners, looking at what would be happening in the future and helping to determine some of the problems for scientists to investigate. There was also an active campaign of information gathering to understand the priorities of communities. Sometimes, development organizations did not put forward the best recommendations, and so required consistent feedback from communities via their elected representatives to shape and reorientate the direction of travel.
Mr. D. Naughten (Chair, IPU Working Group on Science and Technology) said that alongside GESDA’s efforts to map emerging developments, proper regulatory systems needed to be in place to facilitate such developments. In certain cases, however, current legislation could be a barrier for countries in regard to attracting researchers and being at the cutting edge of development. In the example of medical conditions in the brain, certain developments could offer an opportunity to treat such conditions, but research and data collection were hindered by national ethical and privacy laws. Such aspects needed to be addressed in advance of technologies and issues entering the mainstream, in particular at the parliamentary level. He asked how GESDA took the information and data on emerging issues and relayed it to parliamentarians.
Leveraging future breakthroughs for the SDGs
Ms. M. Hood (Executive Director Impact Translator, GESDA) said that, in the context of quantum computing, GESDA’s mapping work had been presented to its diplomatic forum. The feedback received through the forum indicated that there would be significant implications and repercussions for such developments across almost all industries and domains. Another issue highlighted was that the richest nations and several large technology corporations seemingly exercised almost exclusive control and power over the technologies. Democratization and inclusivity in the development process were essential. The fact that quantum computers were not expected to be ready for another 10 years provided a significant time period to involve other partners and the Global South in development.
Funding was another aspect that required attention. There were potential applications for the technology that did not yet have business models in place or financial sponsors. It was also possible for the technology developments to be oriented towards achievement of the Sustainable Development Goals, but no one was actively exploring or funding projects in this regard. It was important to remember the geopolitical ramifications of technological developments, even if the technologies were some time away. To mitigate pain points and propose applicable solutions, GESDA had set up a task force comprising leaders across industry, diplomacy and politics, as well as academic researchers. One of the outputs of the task force was the establishment of the Open Quantum Institute, which provided global and inclusive access to state-of-the-art quantum technology.
Discussion
Mr. M. Omar (IPU) asked Mr. Naughten, in his role as a parliamentarian, what information could be provided to engage parliamentarians, with a view to achieving the greatest impact and ensuring an evidence-based approach in methodologies.
Mr. D. Naughten (Chair, IPU Working Group on Science and Technology) said that it was essential for the potential applications and implications over a horizon of 10, 20 and 25 years to be clearly outlined to parliamentarians. Any barriers to implementation should also be communicated. In the context of medical conditions in the brain, domestic legislation was a barrier to such research in Ireland. Scientists would turn their attention to countries where there were fewer restrictions in place, and Ireland would consequently slip further down the list in terms of delivery of the potential opportunities. In the context of developments in artificial intelligence, there was currently no regulatory regime in place anywhere in the world. Information relating to potential regulations could therefore be provided to parliamentarians. To engage parliamentarians comprehensively, it was essential to start with the end goal and work backwards to the current state of play.
Ms. M. Hood (Executive Director Impact Translator, GESDA) said that similar feedback on neurological research had been received through the GESDA task force. Industry partners and scientists who were particularly interested in carrying out research in the correct way were calling for regulatory frameworks. However, if there were too much regulation in one jurisdiction, less-conscientious parties would move their research to another. There was therefore a need for multilateral legislation to ensure a level playing field. The involvement of scientists and the equal representation of all geographies in developing legislation was essential. There could be no reliable assessment of the potential benefits and risks without their involvement.
Mr. D. Naughten (Chair, IPU Working Group on Science and Technology) said that it was important to ensure that different perspectives in relation to ethics and regulation were represented. Universal access to information was fundamental. It should not necessarily be purely Western ethical perceptions that governed what activities took place, but rather a global ethical perspective. Global engagement in regard to ethical standards was crucial.
Mr. M. Omar (IPU) said that, over the last two decades, science had been driven by market forces. Financial returns were the indicator of the direction of travel in regard to research. It was important to bring science closer to humanity and society. Parliamentarians should consider regulation and ethical perspectives, but it was equally important to remain open to research. It should be remembered that, in the past, research into the human body had been strongly frowned upon by religious institutions, but that research had led to discoveries and understanding that were commonplace in today’s society. How science was perceived by society was ultimately how it would affect humanity.
Ms. M. Hood (Executive Director Impact Translator, GESDA) said that linking science to society was ingrained in GESDA’s work. As an example, GESDA had developed a tool entitled the Pulse of Society, which monitored public debate using an algorithm that analysed conversations about emerging topics in the media and on social platforms in real time. The aim was to understand the sentiments and discussions and how they varied across the world and across groups. Another similar tool analysed how civil society acted through its advocacy programmes and measured public contributions. It was difficult to interact with society at large; parliamentarians were therefore essential to GESDA’s work, in particular in relaying messages from civil society across different geographies.
Ms. A. Del Rosso (CERN) said that she was concerned about the use of the word ‘anticipator’. It was not possible to foresee the evolution of science, because science was about being surprised and seeing how things evolved. The process of asking scientists to select items that could have a potential impact on society in the future was unclear and provoked trepidation about how items were selected and about the resulting influence on parliamentarians and the direction of investments. Science demonstrated that all future impacts had the same probability of occurring. It was not possible to define what was going to be more important for society. The definition of science focused on the unknown. The democratization of quantum computing was similarly unclear. Opening up the technology to all countries so as to allow them to shape the technology to their own specification was flawed. The technology did not currently exist. The development of quantum computing had the same probability of occurring as any other potential future development.
Mr. M. Omar (IPU) said that there was a significant difference between hardcore science and a curiosity in the scientific domain. Scientists could anticipate what would be discovered, which was the beauty of fundamental science. In his view, GESDA did not represent hardcore science, but rather was the link between the future of science and how curiosity could be linked to society; it provided the possibility to anticipate and model the future of science.
Ms. M. Hood (Executive Director Impact Translator, GESDA), in response to the concerns raised, said that it was the scientists themselves who defined the subsegments used in the Radar. All topics and ideas were accepted and reviewed by consensus and on a peer-reviewed basis. She agreed that there was potential for mistakes to be made; however, a new version of the Radar was published every year and it was therefore continuously being adjusted. The intention was to provide a view of the scientific community that was as neutral and objective as possible, and that reflected what the scientific community expected from its domain. In relation to the Open Quantum Institute, the aim was to offer scientists the opportunity to contribute to the development of the technology and allow the scientific community to steer the research. The Institute had no intention of conducting research, but rather wished to allow all countries to participate equally.
Topic 4 – Challenges of Fundamental Science: Ethics and International Collaboration
Speaker: Dr. Michael Doser (Senior Research Physicist and Coordinator for quantum sensing at CERN)
The curiosity drive of fundamental science and boundary conditions
Dr. M. Doser said that fundamental science was a curiosity-driven exploration of the universe rather than an application- or goal-oriented exploration. Its purpose was to obtain a new or better understanding of a particular topic. The process did not have a goal but was about the exploration itself. It was important to be open to serendipity and the role of random chance in the study of fundamental science. The hope was to uncover the unexpected.
Funding agencies tended to be reluctant to support fundamental science as it did not bring immediate societal benefits. There were, however, a great many long-term benefits to be gained from fundamental science.
A number of “science ecosystem” boundary conditions must be met for fundamental science research to work. First, it was vital to accept that assumptions could be challenged and even overthrown. Second, multiple, redundant and different approaches might be needed, which could cause wastage. Third, failure was an inevitable part of the process, and the answers given might be incomplete, ambiguous and temporary. Fourth, severe criticism by other scientists was likely and even necessary to improve the work. Fifth, resource limits meant that promising approaches might be dead on arrival, while intellectual limits meant some answers might never be known. Consequently, the result of the research process was an imperfect, halting, iterative attempt at a better understanding. Funders needed to take the above conditions into account.
Science knew a great deal about the visible universe, from the quarks, protons, neutrons, nuclei and atoms described by the “Standard Model” to proteins, cells, planets, stars and galaxies. The knowledge was precise, tested and verified. However, very little was known about dark matter and dark energy, which represented 95% of the universe. The overall picture was therefore deeply incomplete. Major resources were needed to acquire more knowledge. The next steps went way beyond what any individual scientist, university or country could do, afford or justify. As such, it was necessary to take a global approach and move towards Big Science.
Big Science
Big Science required major financial resources, major manpower, including expertise and motivation, as well as major energy resources. It also required long-term political, financial and moral support, commitment and stability over decades. In light of scarce resources, scientists would have to focus on a small number of questions. There would also be a need for multigenerational planning since the technologies took decades to develop. The idea would be for one generation to design the technology, the next to build it and the following to exploit it. Targeted, streamlined engineering, rather than table-top tinkering, was vital. Indeed, the engineering must be as cheap as possible, as reliable as possible, as redundant as possible and as fail-safe as possible. Equally, Big Science needed enhanced public and political dialogue, not the building of ivory towers, as well as an acceptance of a certain level of risk given that some experiments would inevitably fail. A decision must be made on how much risk was palatable. All of the above required democratic legitimacy within society, within national and global entities and within the scientific community itself.
Big Science meant focusing on a small number of big projects of the order of one billion euros or more. As such, some science targets would need to be dropped. To acquire democratic legitimacy, it was necessary to organize scientific priorities at the grassroots level. Every relevant stakeholder must be involved in the prioritization process so that all were on board with the final projects. Processes such as Snowmass and the European Strategy Group were already used to establish scientific priorities in the area of particle physics.
Given the vast amount of resources required, there was a need to build big communities around Big Science, which meant the projects would not be nimble. In other words, it would be difficult to change direction while keeping everybody on board. The lack of nimbleness posed an organizational and sociological challenge to collaborations.
Big Science was a rich country's pastime. For example, for every scientist engaged in particle physics, Austria paid about 100,000 euros per year for membership of CERN plus a further 100,000 euros per particle physicist per year to carry out experiments. It was an amount that was not accessible to every economy.
In addition, many projects, such as those that involved the construction of telescopes and accelerators, were likely to take decades. It could be that easier, simpler and quicker methods of doing the same thing were discovered along the way thus cutting down the timescale of the project, as had been the case with the Human Genome Project. Such discoveries were rare but could happen. It was therefore important to set aside money for new discoveries alongside the decadal projects.
The risks of Big Science
There were a number of risks involved in Big Science. One risk was that some of the technology might become obsolete by the time a project was launched, as had been the case with space exploration. Rigidity was another big risk. It was also uncertain whether the interesting questions of today would be relevant tomorrow. Indeed, scientists were constantly discovering new fields and techniques which might overtake previous priorities. Nitrogen-vacancy diamonds, for instance, were now a flourishing industrial field but had been completely unknown ten years previously. In addition, there were questions about how to keep scientists engaged in long-term projects given that much of the work was done by engineers. There was also the risk of failure, as well as the need to focus on concrete, feasible questions rather than impractical ones. Lastly, there was a risk in terms of the compatibility of Big Science with the Sustainable Development Goals. The energy consumption at CERN had reached 1.4 TWh in 2022 but would rise to about 4 TWh if more colliders were put into operation. Four TWh was about one 150-thousandth of the energy used by all of humanity. It was important to ask whether such a high level of energy consumption was justified.
Small Science as an alternative
The alternative to Big Science was Small Science, or fundamental science on a smaller scale. Small Science had great support due to its economic value. Many large countries and economic areas were investing Big Science budgets into numerous Small Science activities, particularly those linked to quantum technologies. The investments were of the order of 10 to 100 billion euros per country over the next 5 to 10 years and aimed to inspire rapid societal impact and sociological change. Among the benefits of Small Science were its affordability and nimble nature, as well as its ability to rapidly grow new technology within a few months or years. The risks, however, included duplication of efforts with many people working on the same areas, reinvention of the wheel and an inability to scale. It was also unclear whether it was possible to answer big questions, including those about dark matter, using small devices such as quantum sensors.
Six families of quantum sensors had been identified that were interesting to develop. The next step was to decide on where the technologies would be built. Activities were already going on in Europe, Asia and the United States of America. It would therefore be beneficial to create international collaborations. He warned against parochialism, urging countries not to stay stuck in their own “playground”. There was also a need to take a dual approach, bridging fundamental research with applied science and creating devices that would be useful for both.
However, Small Science must also be made available to the rest of the world, including South America and Africa, which could still prove expensive. There would be start-up and operating costs, bringing the total to about 100,000 euros per physicist. Disparities in educational backgrounds could also hamper efforts to expand in those regions. It was therefore important to improve the level of education across the globe.
Complementary approach between Big Science and Small Science
He was in favour of a complementary approach between Big Science and Small Science which allowed for table-top fundamental physics alongside long-term Big Science projects. Global networks should be open from the get-go, allowing anybody to join and contribute with whatever resources they had. It would be an on-ramp for small institutions or economies, enabling them to participate without having to provide the hardware.
A complementary approach would require participants to balance priorities and split the funding. In his view, the optimal way to split the funding was to give 40% to Big Science, 40% to Small Science and 20% to blue-skies research. The split was in line with the failure rate considered acceptable for each area, namely 5% for Big Science, 30% for Small Science and 90% for blue-skies research. It was necessary to play it safe when spending large amounts of money, which meant that the failure rate of Big Science must be kept low. Blue-skies research, however, must have a high in-built failure rate, and funders must expect most of the money to be wasted.
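As a back-of-the-envelope illustration, combining only the figures quoted above, the blended failure tolerance implied by the proposed split would be:

$$0.40 \times 0.05 + 0.40 \times 0.30 + 0.20 \times 0.90 = 0.02 + 0.12 + 0.18 = 0.32$$

In other words, under those tolerances roughly a third of total funding could go to projects that ultimately fail, most of it by design within blue-skies research.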
Physicists must change their attitudes and come to see failure as good. It was through failure that solid ground could be built. There was also a need to diversify approaches as many would inevitably be unsuccessful. Diversity of communities was also important. By participating in research in concert with the global scientific community, countries would help educate their populations scientifically and technologically in a way that would have global impact but also bring national development.
In summary, fundamental science aimed to attract technical as well as curious individuals, foster creativity, and generally contribute to the welfare and cohesiveness of society. To do so, it must reflect societal changes, be open to and aware of the world in which it operated and be willing to build on