Nodal systems are revolutionizing seismic acquisition: there are virtually no limits on the layout geometries that can be deployed, and deployment happens at record speed.

This shift is not only stimulating the appetite of survey designers and changing the logistics of seismic surveys: it also has a huge impact on the way seismic records are retrieved.

Nodes are designed to stay out in the field for days, continuously recording not only the subsurface response to the seismic sources but also a lot of environmental noise.

Then the nodes are recovered, data and noise from multiple days of acquisition are dumped, and the receivers are redeployed.

Data dumping produces a tsunami of seismic records whose sanity and quality must be assessed quickly and thoroughly in order to ensure operational efficiency, avoid costly re-shooting, and minimize the duration of the campaign and, consequently, the HSE exposure.

Being able to detect any system anomaly or malfunction as early as possible is absolutely critical to the successful completion of the operations.

The problem is that “the system” is made up of tens, if not hundreds, of thousands of nodes continuously interacting with a fleet of vibrators and with the surrounding environment…

The likelihood of “something wrong” happening “somewhere” is extremely high.

How many eyes are needed to supervise the survey and ensure operational efficiency and the best possible data quality?

The answer is not 42, and not even 42 thousand… the answer is SurvEyes.

SurvEyes is a system capable of digesting the huge stream of data generated daily in the field, automatically analyzing it, and detecting anything that could have gone wrong.
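As a minimal illustration of this kind of automated QC, one simple check is to flag channels whose amplitude level deviates strongly from the rest of the spread. The sketch below is not the SurvEyes implementation; the function names, thresholds and synthetic data are all illustrative assumptions, showing only the idea of flagging dead or abnormally noisy channels by comparing each channel's RMS amplitude against the survey median.

```python
import math
import random

def rms(samples):
    """Root-mean-square amplitude of one receiver channel."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def flag_anomalous_channels(records, low=0.1, high=10.0):
    """Flag channels whose RMS deviates strongly from the survey median.

    `records` maps a channel id to its list of samples; `low` and `high`
    are illustrative ratio thresholds (dead/weak vs. abnormally noisy).
    """
    levels = {ch: rms(s) for ch, s in records.items()}
    median = sorted(levels.values())[len(levels) // 2]
    flags = {}
    for ch, level in levels.items():
        if level < low * median:
            flags[ch] = "dead/weak"
        elif level > high * median:
            flags[ch] = "noisy"
    return flags

# Synthetic example: ten channels, one dead, one very noisy.
random.seed(0)
records = {f"node_{i:03d}": [random.gauss(0, 1) for _ in range(500)]
           for i in range(10)}
records["node_003"] = [0.0] * 500                               # dead channel
records["node_007"] = [random.gauss(0, 50) for _ in range(500)]  # noisy channel
print(flag_anomalous_channels(records))
```

In a real system such per-channel statistics would be only one of many checks, computed incrementally as each node's data is dumped.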

The final product is a report in the form of a web dashboard, accessible from virtually anywhere with an internet connection.



Rotterdam Harbor. (Source: USGS)

The possibility that some human activities could trigger or induce seismicity has been known for a century but, over the last few years, it has received increased attention.

The safety of big infrastructures in seismically active regions depends both on understanding the seismic risks at the design stage and on constantly monitoring the seismicity after completion.

To achieve carbon neutrality, the political authorities of heavily industrialized and densely populated regions are betting big on carbon capture, utilization and storage (CCUS) technologies and, where possible, on the exploitation of geothermal resources.

These types of projects also need a thorough assessment and monitoring of the seismic activity.

The monitoring is implemented through the deployment of a network of sensors that continuously “listen” to and record the vibrations of the subsurface.

Most of the time those sensors record just noise, but the detection of anomalous amplitudes at multiple locations triggers the automated event localization – where the event originated – and characterization – how strong the event was.
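The detection step described above is commonly based on an STA/LTA (short-term average over long-term average) trigger, combined with a coincidence requirement across stations so that local noise bursts at a single sensor do not fire the network. The sketch below is a simplified, illustrative version of that idea, not the monitoring application's actual algorithm; window lengths, thresholds and the synthetic traces are assumptions.

```python
import random

def sta_lta(trace, sta=5, lta=50):
    """Short-term over long-term average of absolute amplitude.

    Returns one ratio per sample (0.0 while the LTA window is not yet full).
    A sudden amplitude increase makes the short average jump before the
    long average catches up, producing a high ratio.
    """
    ratios = [0.0] * len(trace)
    for i in range(lta, len(trace)):
        short = sum(abs(x) for x in trace[i - sta:i]) / sta
        long_ = sum(abs(x) for x in trace[i - lta:i]) / lta
        ratios[i] = short / long_ if long_ > 0 else 0.0
    return ratios

def coincidence_trigger(traces, threshold=4.0, min_stations=3):
    """Declare a detection at samples where at least `min_stations`
    stations simultaneously exceed the STA/LTA threshold."""
    all_ratios = [sta_lta(t) for t in traces]
    detections = []
    for i in range(len(traces[0])):
        hits = sum(1 for r in all_ratios if r[i] > threshold)
        if hits >= min_stations:
            detections.append(i)
    return detections

# Synthetic example: four stations of Gaussian noise with a common
# high-amplitude "event" around sample 200.
random.seed(1)
traces = []
for _ in range(4):
    t = [random.gauss(0, 1) for _ in range(400)]
    for i in range(200, 210):
        t[i] += 20.0
    traces.append(t)
print(coincidence_trigger(traces))
```

Once a coincident detection is declared, the arrival times at the triggered stations feed the localization, and the recorded amplitudes feed the magnitude estimate.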

Real-time event localization and characterization provide timely and valuable information for assessing the risk of the ongoing project and operations.

We designed and implemented a modern application that answers all the needs of a seismicity-monitoring workflow.

The tools for designing the sensor network, the utilities for monitoring the health status of the network, and the advanced algorithms for event localization and characterization are all accessible from a single place.

The application is complemented with all the essential utilities for managing the seismic event catalog, and with dashboards that facilitate communication with all the different kinds of stakeholders.