Recipe for British Xmas Dinner: Prepare roast turkey, roast potatoes, boiled potatoes, steamed carrots and broccoli, red-currant & port-wine jelly, and chestnut stuffing. Arrange beautifully on warmed plates. Serve with gravy and sparkling wine.
Recipe for UPSCALED British Xmas Dinner: Prepare as above. Tip food, gravy and wine into blender. Pulse for 5 seconds. Pour into cold soup bowls.
You get the picture? Then you'll have my view on upscaling.
In short, we don't need upscaling. It is an unnecessary step that does nothing but smooth data and destroy information. Regardless of the upscaling method chosen, the result is always a backwards step.
Put into the language of mathematics, the process of upscaling (Up) can be described with respect to the 3D region of the reservoir (P), the reservoir property being upscaled (Po) and the upscaling method used (Sw):

Up = f(P, Po, Sw)
Where P is the big pot, Po is the porridge and Sw is the big spoon (wooden) with which we stir the pot ... sorry, I couldn't help myself ... joking yes, but only half-joking because just to pre-empt any comments of the form: "actually, some upscaling methods are very robust", remember, with upscaling methods we are only discussing (Sw) the type of wooden-spoon being used.
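Wooden spoons aside, the smoothing complained about above is easy to demonstrate. Below is a minimal sketch (pure Python, hypothetical porosity values, simple arithmetic averaging; no particular vendor's method) of what block-averaging a fine-scale property does: the coarse cells preserve the average, but the extremes, which are often the flow-controlling values, are gone.

```python
# Minimal sketch of arithmetic-mean upscaling on a 1D porosity profile.
# Values are hypothetical; the point is what the averaging destroys.

def upscale_arithmetic(fine, factor):
    """Block-average `fine` values into coarse cells of `factor` fine cells each."""
    assert len(fine) % factor == 0
    return [sum(fine[i:i + factor]) / factor
            for i in range(0, len(fine), factor)]

# Hypothetical fine-scale porosity log: tight streaks (0.02, 0.03) inside good sand.
fine_poro = [0.25, 0.24, 0.02, 0.26, 0.25, 0.03, 0.24, 0.26]
coarse_poro = upscale_arithmetic(fine_poro, 4)

print(coarse_poro)                        # two bland, "average-looking" coarse cells
print(min(fine_poro), min(coarse_poro))   # the tight streaks have vanished entirely
```

Swap the arithmetic mean for harmonic, geometric or flow-based averaging and the numbers change, but the blender effect does not: the coarse grid can no longer see the barriers and thief zones the fine grid contained.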
STEAMIN': I'm going to play the longevity card here: I've been building grids and 3D models for E&P companies for longer than it has been routine. I was an "early-adopter". Back in the good old days (which weren't actually all that good) upscaling was required. Even a modest geomodel required more cells than simulators could deal with, and so upscaling was invented. Papers on upscaling were written, software incorporated upscaling methods, upscaling experts appeared. But with Moore's law and the passage of time we don't need to do that anymore. We really, really shouldn't do that anymore.
Example from the 10th SPE Comparative Solution Study: Upper Ness Formation (fluvial environment). From left to right: fine grid and three coarse grids.
WORKIN': So how do we wean ourselves off our addiction to upscaling? The rule of thumb in the early 2000s used to be a maximum of 200,000 cells, which was useful not because it was right but because it instilled discipline. I work using my own two principles: the Orwellian Principle and the Sweet Shop Principle.
Orwellian: "all cells are equal but some cells are more equal than others". Once every 1000 years an RE (reservoir engineer) will appear who will try to pass the blame onto the geologist (no, no, really, it does happen sometimes) and state that the problem is simply that there are too many cells. But cell count, though important, is not everything. Collapsed cells, distorted cells, NNCs and low NPV cells all slow down simulations. So do poor lift curves and PVT tables. It's quality that's important, not quantity, just like Xmas dinners.
Sweet-Shop: I'm a geologist so I can say this: give a geologist enough rope and they'll macramé a 1:1 scale model of the reservoir. When it comes to "bells and whistles" (tech term = functionality) the average geologist is like a Coca-Cola (Irn-Bru for those of you in Aberdeen) infused kid in a sweet shop. Sometimes, as unpopular as it may be, the geologist just needs to be slapped on the wrist, told "you can have just one lollipop and one candy bar", and then marched out into the street and the blinding glare of reality.
RELAXIN': What's the answer? Take it easy. Decide at the beginning, as a team, what you're doing and focus attention on delivering. Years ago BG in Reading, UK, used to have a fantastic set of documents about framing modelling projects that did exactly that. 2D models are good. Sector models are great. What's driving the IJ increment (which has a squared effect on cell count) and the K increment? What do you really need to capture? Are you building a model to simulate, or to see how many cells it takes to break the software (btw: I know the answer)? When it comes to cell count, Less really is More ... sorry, Fewer ... my English-teacher mother will be cursing in her grave.
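The "squared effect" is just arithmetic: halving the areal (IJ) increment quadruples the cell count, while halving the K increment only doubles it. A toy calculation (hypothetical field dimensions and increments, simple regular grid):

```python
# Toy cell-count arithmetic (hypothetical 5 km x 5 km field, 100 m thick)
# showing why the areal (IJ) increment has a squared effect on cell count
# while the vertical (K) increment is only linear.

def cell_count(field_x_m, field_y_m, thickness_m, ij_inc_m, k_inc_m):
    """Number of cells in a simple regular grid with the given increments."""
    ni = field_x_m // ij_inc_m   # cells in I
    nj = field_y_m // ij_inc_m   # cells in J
    nk = thickness_m // k_inc_m  # cells in K
    return int(ni * nj * nk)

base     = cell_count(5000, 5000, 100, ij_inc_m=100, k_inc_m=2)  # 50 x 50 x 50
finer_ij = cell_count(5000, 5000, 100, ij_inc_m=50,  k_inc_m=2)  # 100 x 100 x 50
finer_k  = cell_count(5000, 5000, 100, ij_inc_m=100, k_inc_m=1)  # 50 x 50 x 100

print(base, finer_ij // base, finer_k // base)  # → 125000 4 2
```

So before reaching for a finer areal increment, ask whether the heterogeneity you want to capture actually justifies paying four times over for it.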
THE MUSINGS OF ... I'll say it again: we don't need to upscale and we should not be upscaling. It does nothing but smooth and destroy information. And yet it is still routinely done by many companies and individuals. In fact, I'm going to bet that some folk will still be upscaling their models in 10 years' time whilst working on grid-less/node-less simulators running on quantum computers ...
... oh, hang on, the mathematics and coding for grid-less/node-less simulators is already in place, so 5 years' time ...
... oh, wait, quantum computing is already commercially available, so 2 years' time ...
You get the picture.