Parallel Solution: Computing muscle helping academics and the oilpatch find better ways to produce heavy crude


The oilsands are in a pickle. For the last decade, Canada’s largest source of hydrocarbon production has relied primarily on in situ steam-assisted gravity drainage (SAGD) for its growth. Since the plunge in world oil prices, however, many of the proposed future projects have been placed in doubt, with operators struggling to find more economical ways of extracting the viscous crude.

“With oil at $50, we are in post-SAGD territory,” says Zhangxing (John) Chen. “We need to revolutionize the way we extract bitumen in order to be competitive with other hydrocarbon sources.”

VISUAL AID Researchers at the University of Calgary have access to a visualization centre where high-resolution displays aid in the research of complex problems, such as sorting out the plethora of variables involved in the production of oil from unconventional reservoirs. (PHOTO: UNIVERSITY OF CALGARY)

Chen is a professor in the Department of Chemical and Petroleum Engineering at the University of Calgary, and a world leader in the study of heavy oil. After receiving his initial training as a mathematician in China, he focused on numerical reservoir simulations for Chinese energy giants CNPC, parent of PetroChina, and CNOOC Group. His studies took him to the U.S., where he obtained his PhD from Purdue University in 1991.

Chen currently holds the NSERC/AI-EES/Foundation CMG Industrial Research Chair in Reservoir Simulation and the AITF (iCORE) Industrial Chair in Reservoir Modeling, and is director of the iCentre for Simulation & Visualization. He has helped solve problems for ExxonMobil in the U.S., PDVSA in Venezuela and heavy oil and bitumen operators in Canada. “Suncor used our simulators to increase production and lower the steam/oil ratio in their Firebag SAGD project,” he notes.

Chen’s research involves many different academic aspects, from the theoretical creation of mathematical models to the formulation of physical reservoir problems, but the real meat of his work is developing the software codes that allow highly sophisticated reservoir simulation.

Conventional reservoirs in porous sandstones and carbonates are relatively simple to model. Unconventional reservoirs, such as the oilsands, have a plethora of variables, including complex geometries and multi-phase flow (oil, gas, water and sand). When you add in steam extraction mechanisms and chemical additives, the task of modelling the reservoir in a meaningful way requires the gargantuan power of a massively parallel processing (MPP) network.

“When you have a massively parallel computing network, you can run larger databases with greater geological and production details,” says Chen. “You can solve more phenomena and increase your accuracy of your calculation of resources. You can therefore increase recovery.”

MPPs use thousands of separate processors to perform a set of coordinated computations simultaneously. “The speed of calculation is directly related to the number of CPUs,” says Chen. “If you have one CPU and your calculation takes one hour, then it will take only half an hour if you have two CPUs. Many of our calculations can take 10 days on a single CPU. With 8,000 CPUs, it only takes a matter of minutes.”
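The near-linear scaling Chen describes can be sketched with a toy example: when a workload splits into independent pieces, runtime shrinks roughly in proportion to the number of processors. This is a minimal Python illustration only; the function names are hypothetical stand-ins, not part of any actual reservoir simulator.

```python
# Toy sketch of near-linear parallel scaling: independent chunks of work
# are farmed out to worker processes, and each worker handles 1/N of the
# grid. "simulate_block" is a hypothetical stand-in for a real per-block
# reservoir calculation.
from multiprocessing import Pool


def simulate_block(cell_range):
    # Stand-in for solving one block of grid cells
    return sum(i * i for i in cell_range)


def run_parallel(n_cells, n_workers):
    # Split the grid evenly, one chunk per worker, and solve concurrently
    chunk = n_cells // n_workers
    ranges = [range(i * chunk, (i + 1) * chunk) for i in range(n_workers)]
    with Pool(n_workers) as pool:
        return sum(pool.map(simulate_block, ranges))


if __name__ == "__main__":
    serial = simulate_block(range(100_000))
    parallel = run_parallel(100_000, 4)
    assert serial == parallel  # same answer, split across 4 processes
```

In practice the speedup is rarely perfectly linear: communication between processors and any serial portion of the algorithm cap the gain (Amdahl's law), which is part of why configuring the platform for a highly non-linear problem like heavy oil simulation is a research effort in itself.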

Not surprisingly, MPPs are not thick on the ground. Upfront capital expenditures for such systems are in the order of millions of dollars, and the operating expenses for support staff, electrical power and associated infrastructure can vex even the biggest university budgets.

Using access to IBM’s high-performance computing platform, researchers can model various production scenarios in a matter of minutes that previously would have taken days to calculate. (PHOTO: UNIVERSITY OF CALGARY)

Enter IBM. For the last 10 years, the IBM Alberta Centre for Advanced Studies (CAS) has offered health, environment and natural resource researchers at Alberta universities free access to its high-performance computing (HPC) platform. Ulisses Mello is director of IBM Research - Brazil. He has a PhD in geology, and is currently in Canada working with academics to assess and collaborate on big data challenges.

“IBM has a corporate approach in which we build university relationships on several levels,” says Mello. “We do collaborative work with the best research groups, and we also look for trainees and recruitment. We also have consortiums that include universities and companies that form ecosystems that are committed to broader goals.”

IBM is working with Chen to understand the computational requirements for reservoir simulation of heavy oil. “It is a highly non-linear task, and we are working on ways to better configure the platform,” says Mello. “We are very happy to learn his needs, because it leads to better designs and technologies—it’s a two-way street.”

IBM leverages that learning with its corporate clients. “People hear ‘IBM’ and think of hardware, which is no longer strictly the case,” says Mello. “We are no longer just a supplier of technology; we are involved with strategic initiatives, industrial clouds and much more.” The Alberta CAS mimics the new computing model; it has no bricks-and-mortar campus, or even a dedicated box covered in blinking lights. “It is a virtual centre, operating in our HPC industrial cloud,” says Mello.

For Chen, access to the HPC allows his 50 graduate students and 10 post-docs to improve their modelling in a number of ways. “The collaboration with IBM allows us to increase accuracy, speed and robustness,” he says. “Robustness is the ability to add more physics and chemistry into the simulation; it allows for greater number of variables and different situations, and makes your simulations more realistic.”

The vast computing power also allows academics to experiment with immersive visualization. The University of Calgary has a “cave” where researchers like Mario Costa-Sousa and his students can project information in a manner that allows for the accurate and efficient determination of every stage of a reservoir’s exploration and production cycle. “The main goal is to provide expert users with a more integrated visual analysis environment so they can interact, manipulate, explore and gain new insights,” he noted in a recent interview.

Having access to powerful computing capabilities is especially critical for an oilpatch being challenged by a rock-bottom commodity cycle. “We are looking at new recovery process ideas that can cut costs for shale gas, shale oil and oilsands,” says Chen. “We model simulations in which we add solvents, surfactants, chemicals and additives such as propane and butane. We are also looking at electric heating, magnetic heating and catalysts to see what works best. With the scale of the oilsands, even one per cent increase in efficiency or recovery can make a huge difference.”

Over the last 10 years, Alberta researchers have successfully completed dozens of projects, generating real-world value. What will the next 10 years hold for Chen? “The environment can benefit tremendously from new technologies,” he notes. “We can work to lower water usage, natural gas usage and greenhouse gas emissions. Our target is to reduce energy usage by 90 per cent.”

For IBM, cognitive computing holds special potential. “We are developing OpenPOWER, a new platform and way to configure chips,” says Mello. “We have a new chip called SyNAPSE that is designed to simulate neural networks and have cognitive abilities, such as voice recognition. It can process data right on the spot prior to transmission.”

ENIGMA SOLVED Production from oilsands SAGD facilities, like Suncor Energy’s MacKay River operations, is challenging to model due to the numerous variables involved. The University of Calgary turned to the power of a massively parallel processing network to solve the problem. (PHOTO: JOEY PODLUBNY)

New chips, algorithms and applications are also going to benefit knowledge management. “Right now, there are several reservoir simulation models for heavy oil, each of which are best for particular reservoirs,” says Mello. “Knowing which is best takes a lot of experience. We will be working with experts to capture best practices to help manage the complexity of data and the multitude of analytical options.”

Chen hopes that his group’s work will help engineers focus on the most important reservoir variables, as well as eventually lead the way to off-the-shelf apps that can draw on far larger parallel computing systems.

IBM expects that new technology will aid geoscientists. “Adding value to seismic data involves two equally important components: computations and interpretations,” says Mello. “You can work with algorithms and computing power to increase computational capability, but interpretation relies heavily on human experts, and that’s not easily scalable. Knowledge management will become increasingly important.”

In the meantime, Chen and his team will continue to plug away at what they do best—with a little help from their friends. “There are a lot of massive parallel clusters available in Canada to academics,” says Chen. “But IBM offers a preference for energy research, and has really good scalability.”

By Gordon Cope


Zhangxing Chen, University of Calgary, Tel: 403-220-7825, Email: [email protected]

