Across Europe there are many rock deformation laboratories, each of which runs many experiments. Similarly,
there are many theoretical rock physicists who develop constitutive and computational models
both for rock deformation and for changes in geophysical properties. Here we consider how to open up opportunities
for sharing experimental data in a way that is integrated with multiple hypothesis testing. We present a prototype
for a new forecasting model testing centre based on e-infrastructures for capturing and sharing data and models to
accelerate rock physics research.
This proposal is triggered by our work on data assimilation in the NERC EFFORT (Earthquake and Failure Forecasting
in Real Time) project, using data provided by the NERC CREEP 2 experimental project as a test case.
EFFORT is a multi-disciplinary collaboration between geoscientists, rock physicists and computer scientists.
Brittle failure of the crust is likely to play a key role in controlling the timing of a range of geophysical hazards,
such as volcanic eruptions, yet the predictability of brittle failure is unknown. Our aim is to provide a facility for
developing and testing models to forecast brittle failure in experimental and natural data. Model testing is done in
real-time, verifiably prospective mode, in order to avoid selection biases that are possible in retrospective analyses.
The project will ultimately quantify the predictability of brittle failure, and how this predictability scales from
simple, controlled laboratory conditions to the complex, uncontrolled real world. Experimental data are collected
from controlled laboratory experiments, including data from the UCL laboratory and from the CREEP 2 project,
which will undertake experiments in a deep-sea laboratory. We illustrate the properties of the prototype testing
centre by streaming and analysing realistically noisy synthetic data, as an aid to generating and improving testing
methodologies under imperfect conditions. The forecasting model testing centre allows users to:
1. Data transfer:
• Upload experimental data: We have developed the FAST (Flexible Automated Streaming Transfer) tool
to upload data reliably and periodically from rock physics laboratories to the data repository. FAST allows
users to specify their data transfer requirements and automatically selects the most suitable transfer
protocol. Metadata are created and stored in the data catalogue.
2. Web data access:
• Create synthetic data: Users can choose a generator and supply its parameters. Each synthetic dataset
is automatically stored in the data repository, with corresponding metadata in the data catalogue.
• Select data and models: Each dataset (synthetic or experimental) and each model is well described
by metadata in its respective catalogue, accessible via the web portal. Searches over the metadata
allow users to select the data and models of interest.
• Upload models: Users can upload their models, which are automatically stored in the model
repository, and can choose to share them with other users. The web portal
creates and stores the metadata of each uploaded model in the model catalogue.
• Run models and visualise results: Once a user has selected data and a model, the model is submitted
to a High Performance Computing resource, hiding the complex interaction with clusters from the user.
The results are displayed in accelerated time and stored, allowing later retrieval,
inspection and aggregation.
3. Metadata and Data Storage:
• Automatically store metadata in catalogues, and data, models and results in repositories.
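As a concrete illustration of the "create synthetic data" step, the sketch below produces a realistically noisy accelerating strain-rate series approaching failure, of the kind used to exercise testing methodologies under imperfect conditions. The power-law form, parameter names and noise model here are illustrative assumptions, not the centre's actual generator.

```python
import numpy as np

def synthetic_strain_rate(t_f=100.0, n=1000, A=1e-3, p=1.0,
                          noise=0.05, seed=0):
    """Generate a noisy strain-rate time series that accelerates
    towards failure at time t_f, following an illustrative power law
    rate ~ A / (t_f - t)**p with multiplicative Gaussian noise."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 0.99 * t_f, n)            # stop just before failure
    rate = A / (t_f - t) ** p                       # accelerating precursor rate
    rate *= 1.0 + noise * rng.standard_normal(n)    # measurement noise
    return t, rate
```

Streaming such series through the portal lets model developers check that their forecasting codes behave sensibly before real laboratory data arrive.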
The proposed forecasting model testing centre could be integrated into the EPOS context; its expected benefits are:
• Facilitates the forecasting of brittle failure and the study of how it scales to natural phenomena.
• Enables fast testing and the rapid propagation of ideas.
• Increases the impact and visibility of rock physics and geoscience research.
• Provides important resources for education and training.
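As an example of the kind of forecasting model such a centre is designed to test, the classical failure forecast method observes that the inverse of an accelerating precursor rate often decays roughly linearly in time, so a linear fit to 1/rate extrapolated to its zero crossing yields a failure-time estimate. The function below is a minimal sketch of that idea; the least-squares fit and function name are our illustrative choices, not the centre's implementation.

```python
import numpy as np

def forecast_failure_time(t, rate):
    """Failure forecast method sketch: fit 1/rate = a*t + b by least
    squares and return the time where the fit crosses zero (t_f = -b/a).
    Assumes the precursor rate accelerates roughly as 1/(t_f - t)."""
    inv = 1.0 / np.asarray(rate, dtype=float)
    a, b = np.polyfit(np.asarray(t, dtype=float), inv, 1)  # linear fit
    return -b / a                                          # zero crossing
```

For a noise-free rate A/(t_f - t), the inverse rate is exactly linear and the estimate recovers t_f; the testing centre's role would be to quantify how such forecasts degrade on noisy experimental and natural data.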