SYMPLE proposes a School of Hydrogeological Modelling hosted by professionals and academics. The course will blend theory with practice in ways that our unique industry requires. Such courses are comparatively rare and urgently needed.
The traditional workflow begins with field investigations, proceeds to the building of a conceptual model, and then builds an often overly complex numerical model on that basis. The underlying principle of this old approach is that we can deterministically simulate what happens under the ground. This is a demonstrably erroneous premise, and the “traditional” method has led to widespread distrust of what modelling can achieve.
SYMPLE intends to promote and facilitate the understanding, use and evaluation of hydrogeological numerical models through a multidisciplinary program (the School) associated with the use of strategies aimed at solving specific problems (project-related strategy).
SYMPLE intends to teach an emerging paradigm, supported by the latest available ideas and software for data assimilation, of “starting from the problem and working backwards”. This workflow consists of firstly identifying the type of data that has the greatest capacity to reduce the uncertainties associated with decision-critical predictions of system behaviour, and then designing a numerical simulation strategy that serves the decision-support imperative of actually quantifying and reducing those uncertainties.
Developing better strategies to address existing and pressing problems requires mostly data and software that are already available (the PEST and PEST++ suites), but a new mindset. And in many cases the modelling will be quicker and less expensive because it is:
- management targeted;
- no more complex than it needs to be to serve the decision-support demands;
- supported by project-related strategies with associated specific software.
That is, modelling will be complex enough to assimilate data and reduce uncertainty, but strategically simple because it is decision-focused.
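The core of this workflow — ranking candidate data by their capacity to reduce the uncertainty of a decision-critical prediction — can be sketched with a minimal linear (FOSM-style) data-worth calculation of the kind the PEST++ tools automate. All parameters, sensitivities and candidate observations below are hypothetical, chosen only to illustrate the idea:

```python
import numpy as np

n_par = 5                              # hypothetical model parameters
C_prior = np.eye(n_par)                # prior parameter covariance (identity for illustration)
y = np.array([1.0, 0.5, 0.0, 0.0, 2.0])  # sensitivity of the decision-critical prediction

def posterior_pred_var(J, C_p, sigma_obs, y):
    """Linear (FOSM) variance of the prediction y @ p after assimilating
    observations with sensitivity matrix J and noise std sigma_obs."""
    if J.shape[0] == 0:                # no observations: prior predictive variance
        return y @ C_p @ y
    C_e = sigma_obs**2 * np.eye(J.shape[0])       # observation noise covariance
    S = J @ C_p @ J.T + C_e
    C_post = C_p - C_p @ J.T @ np.linalg.solve(S, J @ C_p)  # Schur complement
    return y @ C_post @ y

# Two candidate observation types, each a row of sensitivities to the parameters
candidates = {
    "head_near_boundary": np.array([[0.9, 0.1, 0.0, 0.0, 0.0]]),
    "head_near_prediction": np.array([[0.1, 0.0, 0.0, 0.1, 0.9]]),
}

var0 = posterior_pred_var(np.empty((0, n_par)), C_prior, 0.1, y)
for name, J in candidates.items():
    var1 = posterior_pred_var(J, C_prior, 0.1, y)
    print(f"{name}: predictive variance {var0:.3f} -> {var1:.3f} "
          f"(reduction {100 * (1 - var1 / var0):.1f}%)")
```

In this toy setting the observation whose sensitivities overlap those of the prediction reduces predictive variance far more than the one near the boundary — which is exactly the ranking that tells a modeller which data are worth collecting before any model run is spent on them.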
Hydrogeology is, by definition, an interdisciplinary subject. It accounts for the chemical, physical, biological and legal interactions between nature and society. The term “complex” is often used in hydrogeological narratives. Complexity comes from the geological setting, the time evolution of the system and its dynamic response to stresses, with an uncountable number of variables to bear in mind at the same time. For this reason, numerical models are a privileged tool and have naturally become commonplace in decision-making processes.
The spread of the numerical approach, however, faces an intrinsic contradiction: models are often asked questions that cannot be answered. The present modelling culture tends to believe that models, especially very complex ones, can solve any problem by providing the “right answer”. This way of thinking ignores the fact that a model will never be a perfect simulator of reality, able to predict the future. What a model can do is: (A) encapsulate what is known; (B) tell what cannot happen in the future on that basis.
Concerning point (A), the definition of “what is known” is not trivial. It is often thought that a model built on whatever data are available (taken at a handful of locations) can come to autonomous life and simulate every aspect of reality we need to enquire about. Groundwater behaviour is determined by aquifer and aquitard properties, which are extrapolated from the analysis of field measurements. The position, quality, amount and interpretation of collected data define the boundaries of the information that can inform our “expert knowledge”, i.e. “what is known”. “Data” should include hard and soft information, interpretation of anomalies, ancillary information, qualitative accounts, etc., analysed in a consilience framework (from Latin com– “together” and –siliens “jumping”). The same data can be processed “as usual” or in a way that maximizes their information potential, according to the principle that evidence from independent, unrelated sources can “jump together” to the same conclusion.
Concerning point (B), numerical models are imperfect simulators. Nevertheless, they can support the decision-making process if they are able to extract information from all available sources in order to quantify and constrain the uncertainties of decision-critical predictions. The numerical model can be developed “as usual”, or it can iteratively continue the extraction of information from the data, informing on their worth, guiding their supplementary collection, and testing conceptual-model hypotheses.
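The logic of points (A) and (B) can be illustrated with a toy rejection-sampling sketch — a crude stand-in for the formal history matching and uncertainty analysis that PEST/PEST++ perform. The one-parameter “model” and all numbers are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "model": one lumped parameter (log-transmissivity) drives both an
# observable head and a decision-critical prediction (drawdown at a well).
def simulate(log_T):
    head = 10.0 - 2.0 * log_T           # observable output
    drawdown = 5.0 * np.exp(-log_T)     # decision-critical prediction
    return head, drawdown

# (A) Encapsulate what is known: a prior on the parameter plus one field
# measurement of head with known noise.
prior = rng.normal(loc=0.0, scale=1.0, size=100_000)  # prior samples of log_T
obs_head, obs_noise = 8.5, 0.2

head, drawdown = simulate(prior)

# (B) Reject what cannot happen: keep only parameter samples whose simulated
# head is consistent with the measurement (within ~2 standard deviations).
keep = np.abs(head - obs_head) < 2 * obs_noise
post_drawdown = drawdown[keep]

print(f"prior drawdown 90% range: "
      f"[{np.percentile(drawdown, 5):.2f}, {np.percentile(drawdown, 95):.2f}]")
print(f"posterior drawdown 90% range: "
      f"[{np.percentile(post_drawdown, 5):.2f}, {np.percentile(post_drawdown, 95):.2f}]")
```

The retained samples do not say what the drawdown will be; they say which drawdowns are incompatible with what is known — the prediction’s uncertainty is quantified and constrained, which is all a model can honestly be asked to do.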
Disappointment and criticism born of frequent model failure are growing, and the need for a change in the current modelling workflow is urgent. Many papers, seminars and initiatives, such as the ongoing Groundwater Modelling Decision Support Initiative (GMDSI), are focused on improving the role that groundwater modelling plays in supporting environmental management and decision-making.
The 2019 Darcy Lecture by John Doherty, entitled “Start from the problem and work backwards”, proposed exactly this paradigm: first identify the data with the greatest capacity to reduce the uncertainties of decision-critical predictions, then design the numerical simulation strategy around quantifying and reducing those uncertainties.
SYMPLE intends to join and advance this paradigm shift, in synergy with other intellectual, professional and industrial initiatives with similar or complementary aims.