Neil McMahon at Bernstein Research says it’s not peak oil, but peak well flow, that is the problem. He does, however, touch on themes advanced by peak oilists, namely that remaining oil reserves are becoming more difficult, and more expensive, to recover. He recounts the phenomenal flow rates of some of the most famous ‘gushers’ of bygone days, notably the Lucas No. 1 well at Spindletop, Texas.
Finds such as these were ‘drilled to death’ and flow rates fell, especially as reservoir pressure subsided. High flow rates enjoyed something of a renaissance thanks to the North Sea and other offshore discoveries of the 1980s and 1990s, and McMahon writes that those flow rates were critical to justifying the expensive and difficult engineering required to develop the fields.
Today, however, he writes, the industry has yet to come to terms with the lower flow rates that newer discoveries will provide.
We have been so used to field developments that have produced 30 kbbl/d, or even 40 kbbl/d in the case of Thunder Horse, that the new phase of developments is likely to come out at half that number. So why will these flow rates drop by half, and won’t technology come along to save the day yet again?
The pre-salt fields off Brazil, deepwater reserves in the Gulf of Mexico and the oil sands may be the industry’s next big hopes (he omits Iraq), and while the industry has historically risen to engineering challenges, “the problem is harder” because:
[In] the past the reservoir conditions and system were not the problem, but for the new developments they are the problem, which means that technology will have to overcome the laws of physics to improve the flow rates from these new discoveries. This will take time, and it is likely that using a quick rule of thumb, the development and operating costs of these new fields will be treble those of the legacy fields on a per barrel basis, if the flow rate is half…because flow-rates really do matter a lot.
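The arithmetic behind that rule of thumb is worth spelling out: halving the flow rate halves lifetime output, so even with unchanged total spend, per-barrel costs double; any extra capex or opex for harder reservoir conditions pushes the multiple higher. A minimal sketch, using entirely hypothetical cost and flow figures (none of these numbers come from the note):

```python
# Illustrative sketch of why halving flow rates can roughly treble
# per-barrel costs. All figures below are hypothetical assumptions,
# not Bernstein's numbers.

def per_barrel_cost(capex_usd, opex_usd_per_day, flow_kbbl_per_day, life_days):
    """Crude per-barrel cost: spread capex over lifetime output, add opex."""
    barrels = flow_kbbl_per_day * 1_000 * life_days
    return (capex_usd + opex_usd_per_day * life_days) / barrels

LIFE = 365 * 20  # assume a 20-year field life

# Legacy-style development: 30 kbbl/d.
legacy = per_barrel_cost(capex_usd=3e9, opex_usd_per_day=4e5,
                         flow_kbbl_per_day=30, life_days=LIFE)

# New development: half the flow, plus (assumed) 50% higher capex and
# higher daily opex for the harder reservoir conditions.
new = per_barrel_cost(capex_usd=4.5e9, opex_usd_per_day=6e5,
                      flow_kbbl_per_day=15, life_days=LIFE)

print(round(new / legacy, 2))  # → 3.0, i.e. treble the per-barrel cost
```

With the same total spend the ratio would be exactly 2; it is the assumed cost escalation on top of the halved output that carries it to roughly 3, which is the shape of McMahon’s rule of thumb.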