GDFM: A New Workflow for Brownfield Rejuvenation

Exploration and production (E&P) activities in the oil and gas industry have witnessed major technological leaps since the early 1990s. These leaps were driven by the increase in computer processing capacity and speed and by the availability of 3D visualization software. Both factors have allowed for large-scale data analysis, dynamic simulation, nodal analysis, and uncertainty assessments. Moreover, 3D seismic has led to the construction of more detailed geological models. However, after long periods of production, many oil fields become brownfields, and the classic models used to study early-phase production in oil fields would not necessarily succeed in brownfields.

Thus, models were adjusted to suit brownfield rejuvenation. One of these approaches is Reversed Geo-Dynamic Field Modeling (GDFM), discussed in the paper “A Reversed Geo-Dynamic Approach for Brownfield Rejuvenation” by Mahmoud Ibrahim, Geoscience Manager, and Gregor Hollmann, Development Manager at Wintershall (now Wintershall Dea). The paper was published in SPE Reservoir Evaluation & Engineering in 2018.

Brownfield Features

The distinction between mature fields and brownfields needs to be kept in mind. A field is considered mature when its production level has declined by more than 50% from the plateau rate and incremental oil recovery (IOR) methods have been deployed to improve secondary reserves. A brownfield, by contrast, is a mature field whose production rate has fallen to 35-40% of the plateau rate and whose primary and secondary reserves are depleted. Brownfields therefore rely on implementing enhanced oil recovery (EOR) to improve recovery within the field or within individual reservoir units.
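
To make these thresholds concrete, here is a minimal Python sketch that classifies a field from its current rate relative to the plateau rate. The function name, threshold band, and example numbers are illustrative only; a real classification would also weigh reserves depletion and the field's IOR/EOR history.

```python
def classify_field(current_rate: float, plateau_rate: float) -> str:
    """Rough life-stage classification from the rate ratios cited above.

    Illustrative only: real classification also weighs reserves depletion
    and the field's IOR/EOR history, not just the rate ratio.
    """
    fraction = current_rate / plateau_rate
    if fraction >= 0.5:
        return "on or near plateau"
    if fraction > 0.40:
        return "mature (IOR candidate)"
    return "brownfield (depleted primary/secondary reserves, EOR territory)"


# Example: a field producing at 18% of its former plateau rate.
print(classify_field(current_rate=1_800.0, plateau_rate=10_000.0))
```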

Brownfield data acquisition changes over the field's lifetime in both type and quantity. Static data acquisition peaks, in quantity and technical quality, during the plateau phase and then declines, serving mainly correlation purposes during later drilling stages. Dynamic data acquisition, on the other hand, grows and continues to be recorded until the field's last day.

There are several challenges to face when attempting to expand the reserves and production of brownfields. These include narrow economics: revamping a brownfield mainly targets increased hydrocarbon production and a lower economic limit, achieved by shrinking decommissioning costs. In addition, brownfields accumulate enormous amounts of data throughout the field's producing life, the so-called “data tsunami” phenomenon, which complicates the identification of the key influential factors in brownfield modeling. A further challenge is that the industry's traditional silos, coupled with a skills shortage, undermine the expected results of brownfield analysis.

GDFM Concept

GDFM is a combined model that does not split data into static and dynamic categories when optimizing the operational directives for a brownfield's further development. Since dynamic data improve in quantity and granularity over the field's lifetime, they make the reservoir easier to understand; GDFM therefore begins with the dynamic data. It is worth noting that GDFM has no fixed direction of implementation, such as top-down or bottom-up; it instead works in both directions simultaneously.

The main goal of GDFM is to establish a reasonable model that follows approved processes for data evaluation and is standardized by full data integration to maximize the value of the data. A reasonable model meets four criteria: it contains few adjustable elements, shows minimal contradiction with existing data, explains most existing observations, and makes predictions about future observations that can disprove the model if they do not eventuate.

GDFM Process

The GDFM process begins with data integration, in which all static and dynamic data are collected, standardized, and quality-checked. The model is thus built by sharing and integrating the dynamic and static data, and implementing the process requires close cooperation within a brownfield evaluation team.
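
As a rough illustration of this integration step, the Python sketch below merges hypothetical static well properties with a dynamic production history and runs basic quality checks. All well names, column names, and values are invented for the example; they are not from the paper.

```python
import pandas as pd

# Hypothetical static well properties and monthly production history;
# in practice these come from the project database after standardization.
static = pd.DataFrame({
    "well": ["A-1", "A-2"],
    "porosity": [0.18, 0.21],
    "net_pay_m": [12.0, 9.5],
})
dynamic = pd.DataFrame({
    "well": ["A-1", "A-1", "A-2", "A-3"],
    "date": pd.to_datetime(["2020-01-01", "2020-02-01", "2020-01-01", "2020-01-01"]),
    "oil_rate_m3d": [420.0, 395.0, 310.0, -5.0],
})

# Integration: attach the static description to every dynamic record.
merged = dynamic.merge(static, on="well", how="left", validate="many_to_one")

# Quality checks: wells producing without a static description, and
# physically implausible values that would distort later calibration.
orphans = merged.loc[merged["porosity"].isna(), "well"].unique()
bad = merged[(merged["oil_rate_m3d"] < 0) | (merged["porosity"] > 0.40)]
print("wells lacking static data:", list(orphans))
print("rows failing QC:", len(bad))
```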

GDFM consists of several phases. Phase zero is scoping, in which a number of questions about the brownfield study must be considered: What is the purpose of the study? What is the current production break-even price? Is there sufficient data, or is additional data acquisition advisable? Is a full-field model mandatory? Is there enough constraint guiding the porosity transform?

The most essential phase is phase one, intensive sharing, during which all disciplines are engaged in a macroscopic field diagnosis that identifies the field's key issues and key potentials. This phase also includes assessing the field's status, integrating well cross-sections, dynamically calibrating the permeability transform, and identifying compartmentalization, flow units, and well-test reviews. Its main deliverables are the validated model-input parameters for history matching and a determination of the field's complexity.
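
The paper does not spell out the calibration recipe, but the idea of dynamically calibrating a permeability transform can be sketched as follows: fit a porosity-permeability transform to core data, then scale it so that the flow capacity (kh) computed over a logged interval honors the kh interpreted from a well test. All numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical core measurements: porosity (fraction) vs. permeability (mD).
core_phi = np.array([0.08, 0.12, 0.15, 0.18, 0.22, 0.25])
core_k = np.array([0.5, 3.0, 12.0, 40.0, 150.0, 400.0])

# Static transform: least-squares fit of log10(k) = a * phi + b.
a, b = np.polyfit(core_phi, np.log10(core_k), 1)

def k_static(phi):
    return 10.0 ** (a * phi + b)

# Dynamic calibration: scale the transform so that the flow capacity (kh)
# predicted over a tested interval matches the well-test-derived kh.
log_phi = np.array([0.14, 0.17, 0.19, 0.21])  # porosity log over the interval
dz = np.array([2.0, 2.0, 2.0, 2.0])           # layer thicknesses, m
kh_static = float(np.sum(k_static(log_phi) * dz))
kh_welltest = 950.0                           # mD*m, hypothetical test result

scale = kh_welltest / kh_static

def k_calibrated(phi):
    return scale * k_static(phi)

print(f"static kh = {kh_static:.0f} mD*m, calibration factor = {scale:.2f}")
```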

Phase two is the population of the geo-dynamic model, in which all static and dynamic data with geometric relevance are integrated. The final phase covers history matching, dynamic simulation, and strategy definition. It is worth noting that the field's rejuvenation does not have a separate phase; instead, it unfolds from the GDFM.
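
To show the history-matching idea in miniature, the sketch below tunes a single parameter of a toy exponential-decline model to minimize the misfit against synthetic observed rates. Real GDFM history matching adjusts a full reservoir-simulation model; the data, decline model, and parameter here are stand-ins.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Synthetic "observed" rates: a toy exponential decline plus noise.
t = np.arange(0.0, 120.0, 6.0)  # months on production
rng = np.random.default_rng(0)
observed = 5000.0 * np.exp(-0.018 * t) + rng.normal(0.0, 50.0, t.size)

def misfit(decline: float) -> float:
    """Sum of squared errors between the toy model and the observed rates."""
    simulated = 5000.0 * np.exp(-decline * t)
    return float(np.sum((simulated - observed) ** 2))

# History match: find the decline constant that best reproduces history.
result = minimize_scalar(misfit, bounds=(0.001, 0.1), method="bounded")
print(f"matched decline constant: {result.x:.4f} per month")
```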

For instance, a field in Germany was classified as a brownfield after its production level shrank to less than 25% of the plateau rate. A high-resolution 3D seismic survey was acquired to resolve the field's complex pattern. The seismic interpretation used not only the seismic and abundant well data but also the data calibrated in phase one. Seismic integration thereby allowed field facts to be introduced into the structural mapping, as such facts are not always obvious in the seismic data alone. Implementing GDFM on this brownfield saved around 30% of the time budget.

GDFM Pitfalls

Even after many decades of data collection and production, uncertainty remains regarding the dynamic behavior of individual faults, water saturation, and relative permeabilities. Nevertheless, combining all available data to establish the GDFM remains the best option for studying brownfields.

When the model is implemented, the distinction between required and unrequired data has to be considered: neglecting important data may result in false predictions, while including unrequired data means extra cost, time, and effort.

Industry Methods of Brownfield Evaluation

Several workflows have been recorded in the literature studying field-data integration, risk assessment, and field modeling. The majority of these workflows focus mainly on static modeling and history matching, giving little attention to integrated brownfield studies.

Commonly, brownfield studies still treat static and dynamic modeling as distinct categories rather than as a single entity, whereas GDFM deals with the data as one integrated entity. Some other studies have tried to close the gap, such as an integrated approach presented in 2015 to define fan-reservoir architecture by including seismic attributes in the geological modeling.

Concluding Remarks

Although brownfields are at an advanced stage of production, they still offer large opportunities for evaluation because of their existing infrastructure and facilities. Hence, smart brownfield management has the potential to lower the economic limit and increase both production and reserves. For that purpose, GDFM is a workflow for brownfield rejuvenation that takes a reversed approach, beginning with history matching and using dynamic and static data from a very early phase. Consequently, GDFM maximizes the value of the available data to best study brownfields.
