Centre for Mathematical Sciences at the University of Plymouth – research grants information
Ongoing projects
Women in Mathematics Day
Funding: £1600
Funding Agency: London Mathematical Society
Investigators: Dr Yinghui Wei (PI, Plymouth), Dr Nathan Broomhead (Co-I, Plymouth), Dr Colin Christopher (Co-I, Plymouth), D. Robertz (Co-I, Plymouth)
Randomised clinical trial of the enVista trifocal intraocular lens
Funding: £46,600
Funding Body: Promedica
Investigators: Dr Craig McNeile (Plymouth) and Dr Alexander Besinis (Plymouth)
Duration: 24 months
Using big data to develop and validate clinical prediction models for survival outcomes in kidney transplant
Funding: £84,786 (with £34,000 from NHS)
Funding agency: EPSRC/NHS
Investigators: Dr Yinghui Wei (PI, Plymouth)
Duration: Oct 2020–Sep 2024
Details of grant
Kidney transplantation is the organ transplant of a kidney to a patient with end-stage kidney disease. When a donor kidney is offered to a waitlisted patient, the clinical team responsible for the care of the potential recipient must make the decision to accept or decline the offer based upon complex and variable information about the donor, the recipient and the process of transplantation. Predicting graft and patient survival following transplantation is important to support this decision-making process.
While studies have been conducted to predict graft failure following kidney transplantation, they did not focus on patient survival and were based on a limited set of variables. There is a clinical need to develop new statistical methods using big data to better predict graft and patient survival in transplant recipients.
This project uses linked registry data from national databases to develop and validate clinical prediction models for survival outcomes. It will integrate data from multiple sources, develop models to predict the risks of graft failure and death over time, and conduct internal and external validation of the developed prediction models.
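As a purely illustrative sketch of the survival-analysis building blocks such models rest on (the data, function and variable names below are invented, not from the project), the Kaplan-Meier estimator computes a survival curve from follow-up times with right-censoring:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up times (e.g. months since transplant)
    events: 1 = event observed (e.g. graft failure), 0 = censored
    Returns (distinct event times, survival probabilities)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_uniq = np.unique(times[events == 1])        # distinct event times
    surv, s = [], 1.0
    for t in t_uniq:
        n_at_risk = np.sum(times >= t)            # still under observation at t
        d = np.sum((times == t) & (events == 1))  # events occurring at t
        s *= 1.0 - d / n_at_risk                  # product-limit update
        surv.append(s)
    return t_uniq, np.array(surv)

# Toy data: six hypothetical transplant recipients
t, s = kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1])
```

Prediction models of the kind described would build on such estimates by letting the risk depend on donor and recipient covariates (e.g. via Cox regression).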
Fresh perspectives for QED in intense backgrounds: first quantised techniques in strong field QED
EP/X02413X/1 (UoP) and EP/X024199/1 (UoE)
Funding agency: EPSRC
PI: James P. Edwards (UoP) / Co-PI: Anton Ilderton (UoE)
Funding total: £600k
Duration: 2 years (Aug 2023 – Aug 2025)
Summary of project:
Quantum electrodynamics (QED) governs the way that light and matter interact and it is our best-tested theory of fundamental physics. Problems in QED can very often be approached, as with other physics problems, in an approximate scheme called perturbation theory. Here one performs calculations to low order in a suitably small expansion parameter, which in QED is the well-known "fine structure constant" proportional to the square of the electric charge. Going to sequentially higher orders in perturbation theory may provide higher precision results, but advances are now required in regimes of very high orders, or even all orders, to obtain meaningful theoretical insight and make sufficiently precise experimental predictions. Such a situation occurs in laser-matter interactions, and the associated higher order calculations are prohibitively challenging.
Modern laser facilities create pulses of intense light with very high photon density, often focussing the equivalent of the total light emitted by the sun onto the head of a pin. The interaction of laser photons with matter adds coherently, so these great numbers of incident photons imply that the laser-matter coupling effectively becomes enhanced, from the fine-structure constant to the so-called "dimensionless intensity parameter." This parameter easily exceeds unity at current facilities – and future experiments will reach values of 10 to 100 – which clearly demands a non-perturbative treatment. Fortunately, such an approach to laser-matter interactions is made possible by the "Furry expansion", which can be thought of as an improved perturbation theory that includes the effects of laser photons, thereby accounting for large values of the dimensionless intensity parameter. This is the theoretical backbone of essentially all previous, current, and future intense laser experiments.
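In standard conventions (the text gives no formulas, so the notation below is ours), the two coupling parameters discussed above are

```latex
\alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} \approx \frac{1}{137},
\qquad
\xi = \frac{e E}{m c\, \omega},
```

where $E$ and $\omega$ are the laser's electric-field amplitude and frequency, and $e$, $m$ the electron's charge and mass; $\xi \gtrsim 1$ marks the regime in which the laser-electron coupling can no longer be treated perturbatively.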
However, the Ritus-Narozhny conjecture states that as we go to ever higher intensities, quantum "loop" effects, which are typically neglected even in the Furry expansion, also become enhanced by laser intensity, to such an extent that all loop orders need to be accounted for – in effect, the Furry expansion breaks down, leaving us without our key theoretical tool. More than simply a technical or mathematical problem, the intriguing implication of the Ritus-Narozhny conjecture is that the high-intensity regime of QED is fully non-perturbative, or "strongly coupled", and therefore inaccessible to standard approximation schemes.
We are therefore currently unable to give any concrete predictions for the physics of this regime, or to answer other questions on the very high intensity behaviour of QED, because non-perturbative calculations in strong fields are prohibitively difficult, at least with the standard techniques employed by the community. Understanding the physics of the high-intensity regime, and identifying "smoking gun" signals of new effects which can be searched for at future experiments, is therefore a challenge which requires new methods.
Worldline techniques are highly valued in quantum field theory for their calculational efficiency, yet their usefulness in strong-field QED (SFQED) has only recently been noticed, and the take-up of such methods in the UK has been very limited. This project will develop the worldline methods required for studying QFT in electromagnetic backgrounds and apply them to strong field problems. Of particular interest is the ability, in the worldline formalism, to derive "master formulae" for whole classes of higher-order processes; this is something which is currently lacking in strong fields, but which is required if we are to understand the physics of the high-intensity regime where higher order effects become important. The project will support national diffusion of expertise in the worldline approach, to the UK and EU SFQED community, and will shed new light on perturbative and non-perturbative aspects of matter in intense laser fields.
Recently completed projects
Using data to improve public health: COVID-19 secondment
Funding: £117,805
Funding Agency: MRC
Investigators: Dr Yinghui Wei (PI, Plymouth)
Duration: 1/10/2021 – 30/09/2022
This fellowship is part of the COVID-19 Longitudinal Health and Wellbeing National Core Study award.
Stochastic wave modelling for inhomogeneous sea-states
Funding: £166,508
Funding agency: EPSRC
Investigators: Dr Raphael Stuhlmeier (PI, Plymouth)
Duration: Feb 2021–July 2023
Details of grant
This project aims to improve modern wave forecasts by developing a better understanding of the nonlinear interaction of random surface water waves. While the simplest mathematical models of waves are linear and assume that waves of different frequencies do not interact, a more detailed understanding of the sea must take into account departures from linearity. In the deep waters of the open sea, where waves are generated by the action of the wind, the fundamental nonlinear energy exchange occurs between quartets of four waves.
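Concretely (in standard notation, which the text does not spell out), quartets of waves exchange energy only when their wavevectors and frequencies satisfy the resonance conditions

```latex
\mathbf{k}_1 + \mathbf{k}_2 = \mathbf{k}_3 + \mathbf{k}_4,
\qquad
\omega_1 + \omega_2 = \omega_3 + \omega_4,
\qquad
\omega_i = \sqrt{g\,|\mathbf{k}_i|},
```

where the last relation is the deep-water dispersion relation linking each wave's frequency to its wavevector.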
These energy exchanges, together with wave-breaking and wind, are the main inputs into wave-forecasting models. Such models inform the surfer waiting for a big swell as well as the engineer planning offshore operations by providing an accurate and timely forecast of wave conditions. Moreover, equations describing the evolution of the sea-state can be used to predict the likelihood of anomalously high rogue waves, which have been implicated in shipping accidents since antiquity.
Current modelling relies on equations that assume waves to be uncorrelated, implying that the sea is spatially homogeneous. This classically results in an equation with a very long evolution timescale and excludes phenomena of physical and mathematical interest. This project is devoted to studying how correlation between wave modes impacts the evolution of wave fields to develop novel equations that can be implemented in wave forecasting systems.
Quantum phenomena in high-intensity laser-matter interactions
Funding: £362,146
Funding agency: Engineering and Physical Sciences Research Council
Investigators: Ilderton, A. (PI, Plymouth), King, B. (Co-I, Plymouth)
Duration: Feb 2019–March 2021
Details of grant
A new era of high-intensity laser experiments has begun.
Recent UK experiments, in which beams of ultra-relativistic electrons were collided with intense laser pulses, have shown that it is possible not only to use intense lasers to probe fundamental physics, but also to generate radiation sources with unique properties, which find applications across the sciences. Such experiments are extremely challenging, and despite recent successes there is disagreement over the extent to which quantum effects have been observed. Discrepancies between experimental results and theoretical predictions have been attributed to the numerical models of quantum effects employed in Particle-In-Cell (PIC) codes used to simulate and analyse experiments.
A host of new experiments will begin this year, and will be able to probe the transition from classical to quantum physics in intense electromagnetic fields. It is therefore critical that we improve our understanding of theoretical models, and their implementations, in order to ensure that theoretical predictions and analyses keep up with experimental progress.
To meet this urgent experimental demand we propose developing existing theory on two fronts.
On one front, we will extend existing models to include currently neglected processes (such as absorption and trident pair production) in a systematic way that can be immediately employed by simulators. On the second front, we will analyse a number of quantum effects which cannot be captured by existing numerical models (but which become relevant in e.g. the overlapping field geometries of future facilities, or in dense electron bunches), assess their importance to experimental campaigns, and develop a methodology to implement them numerically, going beyond current models.
Doing so requires a team of researchers who are not only experts in the theory of quantum effects in intense laser physics, but who also have the experience required to understand numerical implementation and experimental analyses. This is not a case of benchmarking existing codes, already well-covered in the literature. What is needed, rather, is a "top down" approach which can verify, and improve upon, the models of quantum effects which are used in the codes.
Plymouth hosts an established, world-leading research group in the area of intense laser-matter interactions. Staff members are research-active and well-known in the community as experts in the theory of quantum effects in intense laser physics. Furthermore, the Investigators attached to this project are actively involved in experimental efforts, being for example part of the team which recently demonstrated radiation reaction in laser-matter collisions in an experiment at the UK's Central Laser Facility.
As such the Investigators have precisely the right skillset to undertake this timely project and deliver new results of import to a wide community of physicists. This will help maintain the UK's world-leading capabilities in the active research area of intense laser-matter interactions.
Lattice Field Theory at the Exascale Frontier
Funding: £68,545
Funding agency: Engineering and Physical Sciences Research Council
Investigators: Professor Antonio Rago (PI, Plymouth)
Duration: Jul 20–Sep 21
Lattice Field Theory (LFT) provides the tools to study the fundamental forces of nature using numerical simulations. The traditional realm of application of LFT has been Quantum Chromodynamics (QCD), the theory describing the strong nuclear force within the Standard Model (SM) of particle physics. These calculations now include electromagnetic effects and achieve sub-percent accuracy. Other applications span a wide range of topics, from theories beyond the Standard Model, to low-dimensional strongly coupled fermionic models, to new cosmological paradigms. At the core of this scientific endeavour lies the ability to perform sophisticated and demanding numerical simulations. The Exascale era of High Performance Computing therefore looks like a time of great opportunities.
The UK LFT community has been at the forefront of the field for more than three decades and has developed a broad portfolio of research areas, with synergetic connections to High-Performance Computing, leading to significant progress in algorithms and code performance.
Highlights of successes include: influencing the design of new hardware (Blue Gene systems); developing algorithms (Hybrid Monte Carlo) that are used widely by many other communities; maximising the benefits from new technologies (lattice QCD practitioners were amongst the first users of new platforms, including GPUs for scientific computing); applying LFT techniques to new problems in Artificial Intelligence.
The research programme in LFT, and its impact, can be expanded in a transformative way with the advent of pre-Exascale and Exascale systems, but only if key challenges are addressed. As the number of floating point operations per second increases, the communications between computing nodes are lagging behind, and this imbalance will severely affect future LFT simulations across the board.
These challenges are common to all LFT codebases, and more generally to other communities that are large users of HPC resources. The bottlenecks on new architectures need to be carefully identified, and software that minimises the communications must be designed in order to make the best usage of forthcoming large computers. As we are entering an era of heterogeneous architectures, the design of new software must clearly isolate the algorithmic progress from the details of the implementation on disparate hardware, so that our software can be deployed efficiently on forthcoming machines with limited effort.
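A back-of-the-envelope sketch (our illustration, not the project's analysis) of why communication lags behind compute: in a 4D domain decomposition, the fraction of each node's local lattice sites that sit in the communication halo grows rapidly as the local volume shrinks, so faster nodes with smaller local blocks spend proportionally more time exchanging boundary data:

```python
def halo_fraction(local_extent, halo=1, dims=4):
    """Fraction of a local sub-lattice's sites lying in the communication
    halo, for an L^dims block with halo width `halo` on every face."""
    interior = (local_extent - 2 * halo) ** dims
    total = local_extent ** dims
    return 1.0 - interior / total

# As the per-node block shrinks, communication dominates
# (e.g. halo_fraction(4) = 0.9375: nearly every site needs neighbour data):
for L in (32, 16, 8, 4):
    print(L, round(halo_fraction(L), 3))
```

This surface-to-volume scaling is one reason communication-minimising software design matters more as machines grow.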
The goal of the EXA-LAT project is to develop a common set of best practices, KPIs and figures of merit that can be used by the whole LFT community in the near future and will inform the design and procurement of future systems. Besides the participation of the LFT community, numerous vendors and computing centres have joined the project, together with scholars from 'neighbouring' disciplines. Thereby we aim to create a national and international focal point that will foster the activity of scholars, industrial partners and Research Software Engineers (RSEs). This synergetic environment will host training events for academics, RSEs and students, which will contribute to the creation of a skilled workforce immersed in a network that comprises the leading vendors in the subject.
EXA-LAT will set the foundations for a long-term effort by the LFT community to benefit fully from Exascale facilities and transfer some of the skills that characterise our scientific work to a wider group of users across disciplines.
The Universe at Extreme Scales
Funding: £39,766
Funding agency: Science and Technology Facilities Council
Investigators: Dr Craig McNeile (PI, Plymouth)
Duration: Oct 20–Sep 23
Research in particle physics and cosmology connects the largest scales, those of the Universe as a whole, with the smallest, namely those of fundamental particles. By trying to understand how the Universe evolved after the Big Bang, we may gain insight into which particles are yet to be discovered, e.g. at the Large Hadron Collider (LHC), and vice versa.
Concerning the early Universe, it is commonly understood that it underwent a period of rapid expansion, called inflation. However, many open questions remain. For instance, what is the mechanism of cosmological inflation, and, can we link inflation to quantum gravity, a theory that still eludes us? Interestingly, the recent observations of gravitational waves may provide a guide here. Inflation predicts a gravitational-wave background with properties depending on the details of the inflationary model. Hence if this background is observed, it may help us to further uncover details of the inflationary epoch after the Big Bang. Gravitational waves may also shed light on other puzzles, namely those related to dark energy and dark matter. Again, possible alternative theories to Einstein's general theory of gravity, which are designed to solve the dark energy/matter puzzles, may leave their imprint in gravitational waves.
In contrast to this, the LHC probes the smallest length scales, by colliding protons and nuclei at very high energies. In order to test the Standard Model (SM), our current highly successful theory of elementary particles, to the extreme, it is necessary to compute SM processes to high precision, and make predictions of physics beyond the Standard Model (BSM). The former can be done using advanced techniques which go beyond the usual Feynman diagrams. For the latter, one may take the viewpoint that the SM is an effective field theory (EFT), valid up to a certain energy scale only. To understand which novel BSM interactions can give rise to the SM at low energies, without conflicting with high-precision measurements from the LHC, is an outstanding challenge. Two main classes of candidate theories are so-called near-conformal gauge theories and Composite Higgs models, which both give rise to electroweak symmetry breaking and a light Higgs boson. They may even provide dark matter candidates.
These theories have a commonality with the theory of quarks and gluons, Quantum Chromodynamics (QCD), namely that they are strongly interacting. This implies that they cannot be solved easily analytically, but are amenable to numerical simulations on high-performance computing facilities. The study of QCD provides a link between the physics of the early Universe and elementary particles. Namely, as the Universe cooled down after the Big Bang, it underwent a series of phase transitions. During one of those, quarks and gluons combined into hadrons, i.e. the particles we observe today. The QCD phase transition is currently being explored at the LHC, by colliding heavy ions, motivating quantitative predictions on how the QCD spectrum changes with temperature. In fact, even understanding the QCD spectrum in vacuum is still partly unsolved and may guide toward BSM physics.
Quantum field theories (QFTs) describe physical processes across a vast range of energy scales, from fundamental interactions, as mentioned above, to low-dimensional and condensed matter systems. Many new phenomena and the detailed structure of QFTs are anticipated to lie beyond the confines of traditional perturbative methods or numerical simulations. Dualities provide links between hitherto unrelated theories, making tractable questions previously considered to be out of reach. With new dualities being discovered, the richness of QFT is larger than naively expected. Similarly, dynamics out of thermal equilibrium, the process of thermalisation, and the evolution of quantum information, relevant for black hole dynamics, benefit from new approaches, some of which are motivated by quantum information.
Uncharted regimes of light-matter interactions
Funding: £122,610
Funding agency: Leverhulme Trust
Investigators: Ilderton, A. (PI, Plymouth), Dr Tom Heinzl (Co-I, Plymouth)
Duration: Jul 2019–Jun 2021
The potential of accessing a new physical regime within QED has already spurred interest in approaching the regime experimentally. But the conjectured breakdown of perturbation theory points to flaws in our understanding of theoretical methods at high intensity.
Since we currently have no reliable theoretical tools, the results of our research will be significant for theorists working on both particle and laser physics, and experimentalists and simulators working on laser-matter interactions.
High-Dimensional Bayesian Dependence Modelling with Conditional Copulas
Funding: £11,100
Funding body: Royal Society
Investigators: Dalla Valle, L (PI, Plymouth), Dr Julian Stander (Co-I, Plymouth), Liseo, B (Co-I, Rome)
Duration: 2017–2019
The project analysed paediatric ophthalmic data from a large sample of children aged between 3 and 8 years. A Bayesian additive conditional bivariate copula regression model with sinh-arcsinh marginal densities with location, scale, and shape parameters that depend smoothly on a covariate was developed. Bayesian inference about the unknown quantities of our model was performed using a specially tailored Markov chain Monte Carlo algorithm. We gained new insights about the processes which determine transformations in visual acuity with respect to age, including the nature of joint changes in both eyes as modelled with the age-related copula dependence parameter. This allowed us to identify children with unusual sight characteristics, distinguishing those who are bivariate, but not univariate outliers. In this way, the project provided an innovative tool that enables clinicians to identify children with unusual sight who may otherwise be missed.
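For reference, one common parameterisation of the sinh-arcsinh family (our notation; the project's convention may differ) obtains a variate $X$ from a standard normal $Z$ via

```latex
X = \mu + \sigma \sinh\!\left(\frac{\operatorname{arcsinh}(Z) + \epsilon}{\delta}\right),
```

where $\mu$, $\sigma$, $\epsilon$ and $\delta$ are the location, scale, skewness and tail-weight parameters, each of which can be modelled as a smooth function of the covariate (here, age).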
Fields, Strings and Lattices: From the Inflationary Universe to High-Energy Colliders
Funding: £282,295
Funding agency: Science and Technology Facilities Council
Investigators: Professor Antonio Rago (PI, Plymouth), Dr Craig McNeile (Co-I, Plymouth), Patella, A. (Co-I, Plymouth)
Duration: Sep 2017–Sep 2020
Research in particle physics and cosmology connects the largest scales, those of the universe as a whole, with the smallest, namely those of fundamental particles and strings. By trying to understand how the universe evolved after the Big Bang, we may gain insight into which particles are yet to be discovered at e.g. the Large Hadron Collider at CERN, and vice versa, a fascinating prospect!
It is commonly assumed that the early universe went through a period of rapid expansion, dubbed inflation. The mechanisms underlying inflation can be investigated in a number of ways. In the so-called bottom-up approach, one aims to find predictions that are independent of details of models, but only depend on symmetries and the nature of the source of inflation. It is then possible to extract universal features leading to observational predictions and point towards physics beyond our currently known Standard Models of Particle Physics and Cosmology. In the complementary top-down approach, one starts with the given theory, e.g. one that is motivated by string theory, and derives its consequences, which, again might be testable by observations. These approaches can also be used to study the period of cosmic acceleration our universe is currently going through, i.e. dark energy.
String theory is a theory of gravity (and other forces) operating at very high-energy scales. Besides its possible role as a fundamental theory, it has many intricate aspects which require a level of understanding deeply rooted in symmetries and dualities (a transformation that leads to two 'dual' formulations which are superficially very different but yet equivalent). By studying those, one may not only understand string theory better, but also arrive at dual theories which are relevant for e.g. physics beyond the Standard Model (BSM) probed at the LHC, especially if the BSM model is strongly coupled.
In order to make predictions for the LHC, it is necessary to perform very precise calculations, in BSM models and in the Standard Model itself. Some of these calculations can be done by expanding in a small parameter. This does not mean that the computation is easy though, since many scattering processes may contribute. However, it might be that by re-organising these contributions a new, more efficient, formulation can be found.
When there is no small parameter, a theory has to be solved as it stands. Often this can be attempted numerically, by formulating it on a space-time lattice. Since this involves very many degrees of freedom, typically one has to employ the largest supercomputers in the world. The theory of the strong interaction, Quantum Chromodynamics (QCD), is one of those theories in which a small parameter is absent. Although it is formulated in the terms of quarks (as matter particles) and gluons (as force carriers), these are not the particles that appear in the spectrum, which are instead protons, neutrons, pions etc. However, since QCD is so hard to solve, there may be other particles not yet detected and also not yet understood theoretically: examples are so-called glueballs and hybrid mesons. By studying QCD on the lattice, these ideas can be tested quantitatively.
A related question concerns what happens with all these particles when the temperature (as in the early universe) or the matter density (as in neutron stars) is increased. Also this can be studied numerically and a transition to a new phase of matter at high temperature, the quark-gluon plasma, has been observed. Since this phase is currently being explored at the LHC, by colliding heavy ions, quantitative predictions on the spectrum and on transport properties, such as how viscous the plasma is, are needed here as well.
Some BSM models also lack a small parameter and hence are studied using similar lattice computing techniques. By scanning models with distinct features, again hints for the LHC may be found, e.g. with regard to unusual spectral features.
Signatures of strongly interacting dynamics
Funding: £4,098
Funding agency: Science and Technology Facilities Council
Investigators: Dr Vincent Drach (PI, Plymouth)
Duration: May 19–Dec 20
The so-called lattice approach is a very successful first-principles method for solving gauge theories. Calculations rely on large-scale simulations and are typically run on the largest supercomputers. Lattice simulations are a unique tool for exploring non-perturbative phenomena in theories which are not well understood. In Nature, non-perturbative phenomena give rise to the mass of the ordinary proton, which mostly comes from the binding energy of its constituents: the quarks. Tremendous efforts are being made to design extensions of the Standard Model of particle physics using a similar mechanism that could, for instance, explain the mass and properties of the Higgs boson. The project aims at making the first prediction of decay rates of resonances relevant for phenomenological analyses beyond the Standard Model using lattice simulations, and will thus provide quantitative results that are relevant for experiments searching for new physics, such as those performed at the world's largest accelerator: the Large Hadron Collider (LHC).
An assessment of the sensitivity to change of a scale to measure quality of life in patients with severe asthma/ Further validation of a scale to measure quality of life in patients with severe asthma
Funding: £55,754/£44,503
Funding agency: GlaxoSmithKline
Investigators: Dr Yinghui Wei (Co-I, Plymouth)
Duration: 2019–2020
Website: Severe Asthma Research Programme – http://www.saq.org.uk/default.aspx
The quality of life of patients with severe asthma differs from mild and moderate asthma, partly because of a greater burden of symptoms and risk of exacerbations, but also due to differences in treatment, which can have more pronounced side effects. Existing asthma-specific Health-Related Quality of Life (HRQoL) scales are not optimally designed for severe asthma patients.
This project is a collaborative effort between the University of Plymouth’s Faculty of Health: Medicine, Dentistry and Human Sciences, the School of Psychology, and University Hospitals Plymouth NHS Trust. The team consists of Professor Michael Hyland, Professor Rupert Jones, Joseph Lanario, Lucy Cartwright, Dr Yinghui Wei and Dr Matthew Masoli, who is the clinical lead for asthma at the Royal Devon and Exeter Hospital and has established a regional severe/difficult asthma service.
Electron-seeded pair creation in intense laser pulses
Funding: £100,956
Funding agency: Engineering and Physical Sciences Research Council
Investigators: Dr B. King (PI, Plymouth)
Duration: Dec 2016–Sep 2018
As the intensity frontier is pushed back in current and next-generation high power laser facilities (currently under construction), our understanding of how to convert light to higher frequencies in a controlled and efficient way and how to convert that radiation into matter and antimatter is increasing. The proposed research will contribute to this effort by establishing how these processes are generated in high-intensity, short laser pulses, allowing predictions from the standard model to finally be verified, or a deviation to be found.
The process of electron-seeded pair-creation, which forms the subject of the proposal, is a central example of a high-intensity quantum phenomenon. Only a single experiment, E-144, which combined a 47 GeV electron beam and a 10^18 W/cm^2 laser pulse, performed two decades ago at the Stanford Linear Accelerator Center, has measured this effect, and only in the multiphoton regime. It reported observation of the sequential process of nonlinear Compton scattering producing high-energy photons and their subsequent decay, via the nonlinear Breit-Wheeler process, into electron-positron pairs. If this experiment could be performed again with the higher laser intensities available today, the process is predicted to be nonperturbative. These types of processes are of great interest because they are poorly understood and typically occur in difficult parts of the Standard Model, e.g. confinement in QCD is non-perturbative.
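Schematically (our notation, with $\gamma_L$ a laser photon), the two-step channel observed at E-144 is

```latex
e^- + n\gamma_L \to e^- + \gamma \;\;\text{(nonlinear Compton scattering)},
\qquad
\gamma + m\gamma_L \to e^+ + e^- \;\;\text{(nonlinear Breit--Wheeler pair creation)},
```

while in the one-step "trident" process the intermediate photon $\gamma$ remains virtual.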
The aim of the proposed programme is to calculate electron-seeded pair-creation in a laser pulse. Although this process has been calculated in a constant and a monochromatic field, there has been no full calculation in a pulsed field. Inclusion of the pulsed nature is essential for accurate experimental predictions in high power laser experiments. In addition to the sequential process measured at E-144, there is also predicted to be a simultaneous process in which the photon remains virtual (often referred to as the "trident process"). Such virtual processes are currently neglected by QED laser-plasma simulation codes, which are frequently used in the design and analysis of high-intensity experiments.
A main objective of the research is to ascertain to what extent the approximations used in simulation, such as the field being instantaneously constant during the formation of quantum processes, are faithful to the predictions of QED when the duration of the laser pulse is decreased. This will allow for accurate predictions for future experimental campaigns. A further, and related, objective is to establish under what conditions a separation into sequential and simultaneous processes is at all well-defined as the extent of the laser pulse is reduced where quantum interference plays an ever-larger role. Whilst the approximation of lowest-order dressed processes such as photon decay and nonlinear Compton scattering is well-understood, how to approximate higher-order dressed processes such as electron-seeded pair-creation has yet to be properly investigated.
By working with a project partner who is the principal investigator of an EPSRC-sponsored QED laser-plasma simulation campaign, knowledge-transfer from the research in the form of analytical results and expertise to plasma simulation will be ensured. The final aim of the project is the benchmarking of next-generation numerical codes with analytical results. A main beneficiary will be the high-intensity plasma simulation community and we expect our analysis of approximation to this second-order process to be highly relevant to the simulation of other second-order processes such as double nonlinear Compton scattering in short laser pulses, which become more important as the laser intensity used in experiment increases. In general, the proposed research underpins high power laser science and laser-plasma physics, in line with the UK research portfolio.
Virtual Wave Structure Interaction (WSI) Simulation Environment
Funding: £288,508
Funding agency: Engineering and Physical Sciences Research Council
Investigators: Graham, D. I. (PI, Plymouth)
Duration: Oct 2013–Mar 2017
The project is a close collaboration between STFC-RAL and two universities with significant experience in research into wave interactions with fixed and floating structures, working together to combine and apply their expertise to model the problem. The aim is to develop integrated parallel code, implemented on a massively multi-processor cluster and multi-core GPUs, providing fast numerical wave tank solutions that resolve the detailed physics of violent hydrodynamic impact loading on rigid and elastic structures. The project is linked to, and part of, a carefully integrated programme of numerical modelling and physical experiments at large scale. Open-source numerical code will be developed to simulate laboratory experiments to be carried out in the new national wave and current facility at the University of Plymouth.
It is well known that climate change will lead to sea level rise and increased storm activity (either more severe individual storms or more storms overall, or both) in the offshore marine environment around the UK and north-western Europe. This has critical implications for the safety of personnel on existing offshore structures and for the safe operation of existing and new classes of LNG carrier vessels, whose structures are subject to large and at present unquantified instantaneous loadings due to violent sloshing of transported liquids in severe seas. Offshore oil and gas structures in UK waters are already up to 40 years old, and these ageing structures need to be re-assessed to ensure that they can withstand increased loadings in increasingly adverse seas as a result of climate change, and to confirm that their life can be extended by a further 25 years. The cost of upgrading existing structures and of ensuring the survivability and safe operation of new structures and vessels will depend critically on the reliability of hydrodynamic impact load predictions. These loadings cause severe damage to sea walls, to tanks providing containment to sloshing liquids (such as in LNG carriers), and to FPSOs and other offshore marine floating structures such as wave energy converters.
Whilst the hydrodynamics in the bulk of a fluid is relatively well understood, the violent motion and break-up of the water surface remain a major challenge to simulate with sufficient accuracy for engineering design. Although free-surface elevations and average loadings are often predicted relatively well by analysis techniques, observed instantaneous peak pressures are not reliably predicted in such extreme conditions and are often not repeatable even in carefully controlled laboratory experiments. A number of fundamental open questions remain as to the detailed physics of hydrodynamic impact loading, even for fixed structures, and the extremely high pressure impulses that may occur. In particular, uncertainty exists in the understanding of the influence of: the presence of air in the water (both entrapped pockets and entrained bubbles), which changes the acoustic properties of seawater and leads to variability of the wave impact pressures measured in experiments; flexibility of the structure, leading to hydroelastic response; and steepness and three-dimensionality of the incident wave.
This proposal seeks to improve the current capability to attack this fundamentally difficult and safety-critical problem directly, by accelerating state-of-the-art numerical simulations with the aim of providing detailed solutions, not currently possible, to designers of offshore, marine and coastal structures, both fixed and floating.
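As a toy illustration of the explicit time-stepping that underlies wave simulation (a minimal sketch only: a real numerical wave tank solves the free-surface Navier–Stokes equations, and all parameters below are invented for illustration), consider a 1D linear wave equation advanced with central differences:

```python
import numpy as np

# Toy 1D wave equation u_tt = c^2 u_xx, leapfrog time-stepping.
# Illustrative only: a real numerical wave tank resolves the full
# free-surface hydrodynamics described above.

def step_wave(u_prev, u_curr, c, dx, dt):
    """One leapfrog step with fixed (reflecting) end points."""
    r2 = (c * dt / dx) ** 2           # squared Courant number, keep <= 1
    u_next = np.zeros_like(u_curr)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    return u_next

# Gaussian hump, initially at rest, on the unit interval
n = 201
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
c, dt = 1.0, 0.4 * dx                 # Courant number 0.4: stable
u0 = np.exp(-200 * (x - 0.5) ** 2)
u_prev, u_curr = u0.copy(), u0.copy()

for _ in range(500):
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, c, dx, dt)

print(float(np.max(np.abs(u_curr))))  # remains bounded: scheme is stable
```

The stability constraint on the Courant number is the kind of restriction that drives the need for the massively parallel acceleration the project proposes.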
A systematic review of physical activity for alcohol and substance use disorders: evidence synthesis with stakeholder engagement to formulate practical recommendations
Funding: £154,528
Funding agency: National Institute for Health and Care Research
Investigators:
Dr Yinghui Wei
(Co-I, Plymouth)
Duration: Sep 2016–Jun 2018
Alcohol use disorders (AUDs) and substance use disorders (SUDs) have significant avoidable global human and economic costs. In the UK, AUDs and SUDs cost the economy £21bn (£3.5bn in healthcare) and £15bn (£488m in healthcare), respectively. Pharmacological interventions inherently carry complications, and alternative therapies for treatment and prevention are needed. There is growing interest in the possible role that physical activity (PA) may play in reducing AUDs and SUDs with minimal or no adverse effects. Little is known about how PA can best be promoted to prevent and reduce AUDs and SUDs, and NICE guidelines currently make minimal and vague reference to its role [1–7].
Aims
To systematically review the evidence to date in order to describe and quantify the effects of PA on AUDs and SUDs and understand how it is best delivered or promoted, in what setting, and among which populations, to encourage the prevention, reduction, and abstinence from AUDs and SUDs.
In the final phase of the study our aim is to elicit the views of service leads and users on how the findings can be used to guide future funding and interventions for reducing progression, use and post-treatment relapse.
Plan of investigation
A wide selection of electronic databases will be searched based on a list of key words to generate a list of published research (including grey literature and qualitative investigations) which will then be screened independently by two researchers according to a predefined checklist. Eligible studies will be rated for quality and risk of bias, and data on study details, participant characteristics, AUD/SUD related factors, intervention details and setting, control conditions, and outcomes will be extracted by one researcher and checked by another. Data across similar studies will be synthesised in a meta-analysis. Moreover, we will provide a detailed narrative synthesis using tables, diagrams and narrative texts across the studies, interventions, outcomes, populations and settings.
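To illustrate the kind of pooling a meta-analysis performs (a simplified sketch only: the review itself would use established statistical software, and the effect sizes below are invented), a DerSimonian–Laird random-effects analysis can be written as:

```python
import numpy as np

# DerSimonian-Laird random-effects meta-analysis (illustrative only;
# the effect sizes and standard errors below are invented).
def dl_pool(effects, ses):
    effects, ses = np.asarray(effects), np.asarray(ses)
    w = 1.0 / ses**2                           # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)  # Cochran's Q statistic
    k = len(effects)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)         # between-study variance
    w_re = 1.0 / (ses**2 + tau2)               # random-effects weights
    theta = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta, se, tau2

# Hypothetical standardised mean differences from five PA trials
theta, se, tau2 = dl_pool([-0.30, -0.10, -0.45, 0.05, -0.25],
                          [0.12, 0.15, 0.20, 0.18, 0.10])
print(round(theta, 3), round(se, 3))
```

The random-effects model is the usual choice when, as here, interventions, populations and settings are expected to vary across studies.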
Potential benefits to people and NHS
The proposed review will present the evidence to date on the role of PA in the prevention, harm reduction and treatment of AUDs and SUDs. After summarising this literature, service leads and users will have the chance to reflect on, and add further guidance on, how PA interventions can be designed to have the greatest reach and effectiveness. They will also add to recommendations on how support for PA can be offered within the NHS, other public health services, and appropriate third sector and charity organisations, with implications for funding. The review will examine the many aspects of how PA is delivered, to whom, by whom, and in what setting, in order to identify the potentially most effective services. It will provide information for those involved in the treatment and prevention of AUDs and SUDs as to the most effective and cost-effective applications of PA. It will also highlight potential research gaps to allow for future research planning within the NHS. The review will also be presented to the appropriate NICE guidelines review panel with ideas on how the findings could be incorporated into future revisions of guidance on effective interventions. Should there be a need for further research on the effectiveness and cost-effectiveness of PA interventions for specific groups, we will present the findings to the appropriate NIHR prioritisation panel and other funders (e.g. National Lottery awards).
A CCP on Wave/Structure Interaction: CCP-WSI
Funding: £483,159
Funding agency: Engineering and Physical Sciences Research Council
Investigators: Graham, D. I. (Co-I, Plymouth)
Duration: Oct 2015–Sep 2020
The proposal is to establish a new Collaborative Computational Project (CCP) serving the UK research community in the area of wave structure interactions (WSI). The new CCP-WSI will bring together computational scientists, Computational Fluid Dynamics (CFD) specialists and experimentalists to develop a UK national numerical wave tank (NWT) facility fully complementary to existing and future UK experimental laboratory facilities for marine, coastal and offshore engineering, and thus support leading-edge research in an area of high national importance. Substantial progress has been made on a number of past and current EPSRC project grants held by the lead partners in this CCP bid to develop and test the primary elements of a numerical wave tank and to carry out cutting-edge wave impact experiments alongside new open-source CFD code development. We believe it is timely to focus the activities of the community on the development of open-source NWT code held within a central code repository (CCPForge). The code will be professionally software-engineered and maintainable, tested and validated against measurement data provided by the partner experimentalists, whilst retaining sufficient flexibility to meet the requirements of all members of the WSI community. This model for sharing developments collaboratively within a consortium of partners, through a central code repository that is sustainably managed for the future, has been developed by the lead partners in related EPSRC-funded research projects. The proposed CCP-WSI would extend the framework and methodology for sharing and future-proofing EPSRC-funded code developments in wave structure interaction to the wider community. This will be achieved through a programme of community events and activities designed to foster links between experimentalists and those performing computations, and between industry users, academics and the interested public.
New Ideas in Gauge, String and Lattice Theory
Funding: £184,900
Funding agency: STFC
Investigators: Langfeld, K. A. (PI, Plymouth), Patella, A. (Co-I, Plymouth),
Professor Antonio Rago
(Co-I, Plymouth)
Duration: Oct 2014–Sep 2017
Abstract: The standard model of particle physics encodes our current knowledge of the fundamental constituents of atoms and the nature of matter in the earliest moments following the Big Bang. However, our understanding of the dynamics of the standard model is limited by our ability to solve its strongly-interacting sector, quantum chromodynamics (QCD), which describes the interactions of quarks and gluons. The Swansea and Plymouth groups are approaching this problem from two complementary perspectives. By approximating the continuum of spacetime as a discrete lattice of points, it is possible to simulate QCD on high performance computers. The groups will study lattice QCD in the extreme conditions of high temperature and density which existed following the Big Bang and which can now be realised in heavy-ion collisions at the Large Hadron Collider (LHC) at CERN. These investigations will be complemented by analytic insights arising from 'gauge-gravity duality', a remarkable principle which relates the theories describing particle physics with properties of general relativity.
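The discretisation idea can be shown in miniature (a minimal sketch, far simpler than the simulations described above: real lattice QCD uses 4D SU(3) link matrices and sophisticated algorithms, whereas this toy uses 2D U(1) link angles and plain Metropolis updates):

```python
import numpy as np

# Toy 2D U(1) lattice gauge theory with the Wilson plaquette action.
# Illustration of the lattice idea only; real lattice QCD works in
# 4D with SU(3) matrices on each link.
rng = np.random.default_rng(0)
L, beta = 8, 2.0
theta = np.zeros((L, L, 2))            # link angles theta[x, y, mu]

def plaquette(th, x, y):
    """Gauge-field angle around the elementary square at site (x, y)."""
    xp, yp = (x + 1) % L, (y + 1) % L
    return th[x, y, 0] + th[xp, y, 1] - th[x, yp, 0] - th[x, y, 1]

def local_action(th, x, y, mu):
    """Part of S = -beta * sum cos(plaquette) involving link (x, y, mu):
    only the two plaquettes containing that link contribute."""
    if mu == 0:
        p1, p2 = plaquette(th, x, y), plaquette(th, x, (y - 1) % L)
    else:
        p1, p2 = plaquette(th, x, y), plaquette(th, (x - 1) % L, y)
    return -beta * (np.cos(p1) + np.cos(p2))

for _ in range(200):                   # Metropolis sweeps
    for x in range(L):
        for y in range(L):
            for mu in range(2):
                old = theta[x, y, mu]
                s_old = local_action(theta, x, y, mu)
                theta[x, y, mu] = old + rng.uniform(-0.5, 0.5)
                # accept with probability min(1, exp(-(S_new - S_old)))
                if rng.random() >= np.exp(s_old - local_action(theta, x, y, mu)):
                    theta[x, y, mu] = old      # reject the move

avg_plaq = np.mean([np.cos(plaquette(theta, x, y))
                    for x in range(L) for y in range(L)])
print(round(avg_plaq, 3))
```

In this 2D toy model the average plaquette can be checked against the known analytic result I1(beta)/I0(beta); in full QCD no such closed form exists, which is why the high-performance simulations described above are needed.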
The primary goal of the LHC is, however, to discover the new physics which is responsible for the generation of mass for the elementary particles. This 'electroweak symmetry breaking' is the least understood part of the standard model. It may be due to the existence of a background field permeating spacetime, which gives mass to particles as they interact with it. On the other hand, mass generation may be due to the existence of a new strong interaction at the TeV energy scale probed by the LHC. In both cases, the theories predict the existence of a new spin zero particle, the famous Higgs boson recently discovered at the LHC. Distinguishing these possibilities is a subtle problem and once again we are attempting to resolve the question using both gauge-gravity duality and lattice simulations.
Particle physicists do not, however, believe that the standard model is the ultimate theory of nature. It is an example of a gauge theory, a theoretical framework which unifies quantum mechanics and special relativity together with the fundamental symmetries which physicists have discovered through decades of experiments with particle accelerators. Meanwhile, gravity remains outside this framework, being described by general relativity in terms of the curvature of spacetime. A deeper unification appears possible with superstrings, which contain both gauge theories and gravity together with a new type of spacetime symmetry known as supersymmetry. The Swansea group is therefore complementing its investigations of LHC physics with research into the deeper structure of gauge fields and strings, using fundamental ideas such as gauge-gravity duality and 'quantum integrability' in the search for the underlying principles behind our current theories of particle physics.