Here’s a central truth. Geology is the single most important aspect of resource estimation. It beats geostatistics. It defines sampling requirements. It controls sample quality.

Get the geology wrong and the estimate will be wrong. As simple as that.

Why then do we spend less and less time and effort on understanding the geology of our deposits and effectively translating that geology into reliable estimates? The literature is full of examples of seemingly obvious differences between geological observations and the resource model or estimate.

Let’s redress that problem.

Once you’ve drilled a lot of holes and logged them (hopefully robustly), it’s time to get set for interpretation. Actually… that’s almost totally wrong! It’s not a linear process. Your drilling, logging, hole planning and interpretation happen in parallel, with each task reinforcing and improving the others. It’s a holistic process. If you leave the interpretation until the program is finished and all the assays are available, you are wasting an opportunity.

The secret here is the scientific principle of falsification. Develop a theory (i.e. a model) then test that theory (with a drill hole). By combining the interpretation and data collection phases you can focus the drill program on the areas of least certainty and/or greatest variability. That’s a vast improvement over simple pattern-based drill programs.
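That falsification loop can even be sketched numerically. This is a minimal, hypothetical example – the `next_target` helper is mine, and it uses distance to the nearest existing hole as a crude stand-in for uncertainty (a real program would use something like estimation variance). It picks the candidate collar that tests the model where it is least constrained:

```python
import math

def next_target(candidates, holes):
    """Rank candidate collar positions by a crude uncertainty proxy:
    distance to the nearest existing hole. Drilling the least-sampled
    spot tests the model where it is weakest."""
    def nearest(c):
        return min(math.dist(c, h) for h in holes)
    return max(candidates, key=nearest)

# Existing collars (x, y) and a few candidate positions.
holes = [(0, 0), (50, 0), (100, 0)]
candidates = [(25, 0), (75, 0), (50, 80)]
print(next_target(candidates, holes))  # (50, 80): the spot far off the drilled line
```

The scoring function is the illustrative part; the falsification logic stays the same whatever you plug in – drill where the current model is least tested.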

Managing a drilling program and the geological interpretation of that program requires a bit of thought. You need to understand the purpose of the program and interpretation. The ‘why’ frames the problem and the solution. Of course there’s a trap here too… the reason can and does change over time and while you may not think so, that can impact on the nature and quality of the interpretation. With that in mind however, there are some basics about interpretation that every geologist (or engineer or metallurgist) should understand.

First and foremost, there are multiple layers of interpretation. There’s the structural framework, the lithology, the alteration. Then there’s the mineralisation, the geometallurgy, the geotechnical. And the list goes on. While each of these is of necessity related to the others, they are not the same. Long gone are the days of one-size-fits-all. Take the complete set and that’s GEOLOGY. It’s an integrated picture. No single layer can stand apart.

So where do you start?

Start with the data. The first rule is – check the data. The second rule is – check the data.

Bad data is the cause of innumerable interpretation errors. As humans we are biased to believe, and we have the ability to construct stories that match our world view. I’ve lost track of the number of times some strange-looking geology has been predicated on one or two data points – the entire model skewed to fit the data – only to find that those data points were incorrect.

By checking the data I mean REALLY checking the data. Recall my last post (how to log a drill hole) and the difference between fact logging and the classification of those facts into lithologies or other units. You can’t simply accept that the classification step in the logging process is ok. You need to check. My preference is to go back to the core (harder with chips). You need to get down and dirty with the data.
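Some of that checking can at least be automated before you get to the core farm. As a minimal sketch (the `interval_gaps` helper and the example log are hypothetical), this flags gaps and overlaps in a hole’s from-to intervals – classic fingerprints of transcription or classification errors:

```python
def interval_gaps(intervals):
    """Check a hole's from-to log. Logged intervals should tile the hole
    exactly; gaps and overlaps usually mean a transcription or
    classification error worth chasing back to the core."""
    gaps, overlaps = [], []
    ordered = sorted(intervals, key=lambda iv: iv[0])
    for (f1, t1), (f2, t2) in zip(ordered, ordered[1:]):
        if f2 > t1:                     # next interval starts too deep
            gaps.append((t1, f2))
        elif f2 < t1:                   # next interval starts too shallow
            overlaps.append((f2, min(t1, t2)))
    return gaps, overlaps

# Hypothetical lithology log: 12-15 m is unlogged, 19-20 m logged twice.
log = [(0, 12), (15, 20), (19, 30)]
print(interval_gaps(log))  # ([(12, 15)], [(19, 20)])
```

A check like this tells you *where* to look; it can’t tell you which of the overlapping intervals is right. That still needs the core.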

Have multiple drill holes laid out. If you can, select holes that are in the same area (dare I say it… the same section?). By looking at adjacent drill holes you can compare units, contacts and structures. Can you trace a structure from one hole to the next? Does the contact between units always look the same, or does it change from hole to hole? If it changes, what is the nature of that change? Think back to all that geological theory you learnt at university and try to conceptualise the processes in action when the rock mass formed.

This is more than just data checking, isn’t it? Now you are starting to validate some of those underlying assumptions you formed when logging individual holes. You are now interpreting the complete spatial context of all the holes together – as a set of observations, not individual samples. It’s amazing what a difference that makes.

It’s this sort of validation, checking and thinking that changes the gestalt. You can eliminate data and classification errors (don’t fit the data to your preconceived model). You can construct a 3D view of the world in your mind – making it that much easier to translate that model into digital space.

I have many memories of time spent in core farms walking from hole to hole, often with a roll of flagging tape in hand, marking like contacts, structures and units. In hindsight this was frequently the most value adding part of my work.

If you don’t have access to the drill core, try using core photos. If you don’t have core photos, at least pull out the original logging. Check who logged the hole and when. Check their definitions against any other geologists working on the same program – try to confirm consistency in definitions.


After you’ve reviewed, validated and confirmed the data it’s time to start seriously thinking about that interpretation. Now there’s a lot of talk these days about the good, bad and downright ugly of implicit models. There’s also a lot of discussion about the limitations of sectional interpretation. Personally, I think either method can work – it all depends on the quality of the effort going into the model.

There’s a misnomer here: packages like Leapfrog should more correctly be considered ‘semi-implicit modellers’.

Take a manual (paper!) interpretation. I’ve seen excellence and I’ve seen sloppy. Someone who can think in 3D can create a fit-for-purpose and robust model with the most basic tools. Give that same person the latest software and they may fluff the output simply through a lack of understanding.

What really separates manual interpretation (either on paper or manual digitising in 3D) from the semi-implicit methods is speed and objective repeatability. You can potentially complete the final stage of interpretation (making a model) much faster using implicit techniques. That also allows you to more fully explore degrees of freedom and alternative world views, something that is more challenging with manual methods.

An anecdote from the dim past… When I was working at Mt Isa, the management consultancy Proudfoot PLC were engaged to review site-wide productivity. One morning I was the target of one of their assessments. The consultant approached me at the start of the shift and asked what I had planned for the day. “I’m going to interpret the geology on this set of cross-sections” I responded. The consultant took notes then disappeared until the end of the day. I worked diligently on those sections. Working back and forth between sections, projecting drill holes on and off section, switching to plan and so on. About an hour before knock-off I came to the conclusion that my interpretation was flawed. I didn’t understand the underlying structural framework and my model was simply wrong. I erased every section (remember pencils and erasers?) No sooner had I erased everything than that Proudfoot consultant came back in and asked “So… how did you go with the interpretation? Did you finish?”. He could see the blank sheets (A0) in front of me. I explained “I’ve done the hard work today – I’ve been exploring possibilities. Should get things finished tomorrow.”

It’s that sort of thinking space that adds value. Not blindly pushing buttons while staring at a computer screen.

The (semi) implicit modelling approach has some of its own strengths and weaknesses. We shouldn’t think of these tools as something that ‘eliminates the geologist’. The two aspects I like the most are the repeatability and the internal consistency of the models (for some, not all packages). I do miss the ability to show an evolution of thinking by incorporating sketches, notes and photos (there are some emerging tools for this).

To be frank, an implicit model and a manual model (should) take the same amount of thought. Shortcuts in either result in poor results.

If you’ve navigated the choice of interpretation tools (maybe some combination of approaches?) and you are certain the data and classification of that data is robust and pitched to the end purpose of your interpretation, it’s time to start making shapes and adding points. Underlying both the manual and implicit approaches is the geologist’s ability to lock down spatial vectors that define aspects of the geology.

The very first geological aspect should be structure. Structure controls or influences the vast majority of other geological features. Most mineralised systems (particularly base and precious metals) are related to some structural feature. Structure controls fluid paths and traps. Structure affects location, orientation and dimension.

Here’s where the first stumbling block arises. You need to have a good handle on structural geology and the evolution of multiphase structural events to develop a sound structural framework. Historically this is a poorly developed tool in the geologist’s toolbox. We can’t all recognise strain patterns or understand the stress field they resulted from. In the wrong hands developing a structural framework can lead to geometries and relationships that are all but impossible.

My advice? It’s worthwhile employing a specialist structural geologist to help with these initial interpretation steps. Choose someone familiar with the area on a regional scale.

Next steps

With your structural model in mind you can start creating the 3D shapes; typically planes for faults, and form lines for structural fabrics (lineations, cleavage, bedding and the like). I like to set up the faults first where possible. They form the limits for the rest of the interpretation in one way or another.
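Those fault planes lend themselves to a quick numeric sanity check. As a sketch (the `fit_plane` helper and the intercept picks are hypothetical), a best-fit plane through the fault intercepts from several holes falls out of a small eigen-decomposition – the plane normal is the direction of least variance:

```python
import numpy as np

def fit_plane(points):
    """Best-fit plane through fault intercept points: the plane normal is
    the direction of least spread (smallest principal component)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov((pts - centroid).T))
    normal = vecs[:, np.argmin(vals)]
    return centroid, normal              # plane: normal . (x - centroid) = 0

# Hypothetical fault intercepts from four holes, lying close to the plane z = x.
picks = [(0, 0, 0.1), (10, 0, 9.9), (0, 10, 0.0), (10, 10, 10.1)]
centroid, normal = fit_plane(picks)      # normal close to +/-(0.71, 0, -0.71)
```

Comparing the fitted orientation against your structural framework is the point – if the best-fit plane disagrees with the expected fault geometry, either a pick or the framework needs another look.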

Interpreting lithology, alteration and mineralisation follows a similar approach in each case. Using the structure and the data, work through unit by unit, taking care to maintain the relationships between each unit. This is where all that hard work in the core farm comes into its own. You should have a sound understanding of overprinting or layering. You should know about unconformities. The form lines from your structural model offer a guide to general trends.

It’s worth noting here that now is the time to reconfirm your logging classes. Check your model as it progresses. If something looks strange, if something is inconsistent with your geological knowledge and understanding, investigate. Don’t simply ignore the data without a valid reason. Don’t include the data without a valid reason. Maybe the hole is in the wrong place. Maybe there’s been a transcription error or the collar coordinates have been swapped.
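The swapped-coordinate case in particular is cheap to screen for. This sketch is hypothetical (the `suspect_collars` helper, hole IDs and tolerance are illustrative): it flags surveyed collars sitting too far from plan and tests whether swapping easting and northing would explain the error:

```python
import math

def suspect_collars(planned, surveyed, tol=5.0):
    """Flag holes whose surveyed collar sits more than `tol` metres from
    plan, and test whether swapping easting and northing would explain
    the error -- a common transcription slip."""
    flags = {}
    for hole, (pe, pn) in planned.items():
        se, sn = surveyed[hole]
        if math.hypot(se - pe, sn - pn) > tol:
            swapped = math.hypot(sn - pe, se - pn) <= tol
            flags[hole] = "swapped E/N?" if swapped else "check survey"
    return flags

# Hypothetical collars: DDH001 looks like an easting/northing swap.
planned = {"DDH001": (1000.0, 2500.0), "DDH002": (1100.0, 2500.0)}
surveyed = {"DDH001": (2500.0, 1000.0), "DDH002": (1101.0, 2501.0)}
print(suspect_collars(planned, surveyed))  # {'DDH001': 'swapped E/N?'}
```

It won’t catch every database sin, but it separates ‘go re-survey the collar’ from ‘go find the original transcription’ before the model gets skewed.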

My favourite ‘challenge’ is holes drilled at a shallow angle to the surface you are trying to interpret. It doesn’t take a great deal of change in the hole location to dramatically change the surface-hole relationship and positioning.

If all goes well, you will now have a robust and geologically sound interpretation. That’s not where the story stops.

We need to represent the mineralisation. That’s a challenge in itself. Every different ore body will have its own unique mineralisation signature. Yes, we can group mineralisation styles and commodity similarities, but at the scale we need to understand them, each system will be unique.


I can’t cover all aspects of defining mineralisation domains in this single post. There will be more on this topic. However… here are some points to consider.

1. Understand the paragenesis of your deposit. It will help.
2. Check the correlation or lack of correlation between different variables you want to interpret. If there is a strong correlation make use of it when forming your domains. Equally if there’s no statistical correlation I’d strongly advise checking any assumption of spatial correlation.
3. Look for the controls. The fluid paths. The traps. As one of my old structural geology friends used to say “show me an ore body and I’ll show you a breccia”. There’s a lot of truth in that concept.
4. Understand orientation. There are a number of ways to do this. You are looking for the directions of maximum and minimum spatial continuity. At its simplest, you can get a feel for plunge from a long projection. Or if you are happier with computer visualisation pull up your data, colour the top 5% or so in a hot colour and the rest in cool colours then spin it around until those high values line up – but beware… there may be more than one trend direction. It’s also worthwhile doing some preliminary variography to find directions of maximum/minimum continuity. Please… don’t just assume the lithological strike/dip and the mineralisation strike/dip are the same. They may be but test and validate.
5. Don’t let preconceived ideas about cut-off (economic or ‘geological’) colour your ideas. For the majority of deposits there is no magic number. Geology doesn’t adhere to economics. Remember if you interpret at a cut-off any model you create will exaggerate that interpretation.
6. Look at where there is no mineralisation as much as where there is mineralisation. Sometimes it’s easier to do an inverse interpretation.
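Point 4 can be given a quick numeric form. As a sketch (the `high_grade_trend` helper and the synthetic data are illustrative, not a substitute for variography), the leading principal component of the coordinates of the top few percent of grades approximates ‘colour the top 5% hot and spin the view until they line up’:

```python
import numpy as np

def high_grade_trend(xyz, grades, pct=95):
    """Leading principal component of the coordinates of the top few
    percent of grades -- a numeric stand-in for spinning the
    hot-coloured samples around until they line up."""
    hot = xyz[grades >= np.percentile(grades, pct)]
    centred = hot - hot.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(centred.T))
    return vecs[:, np.argmax(vals)]      # unit vector, sign arbitrary

# Synthetic data: low-grade background plus a high-grade 45-degree structure.
rng = np.random.default_rng(0)
bg = rng.uniform([0, 0, -10], [100, 100, 10], (400, 3))
t = rng.uniform(0, 100, 100)
vein = np.column_stack([t, t, rng.normal(0, 1, 100)])
xyz = np.vstack([bg, vein])
grades = np.concatenate([rng.uniform(0, 1, 400), rng.uniform(3, 5, 100)])
v = high_grade_trend(xyz, grades)        # roughly +/-(0.71, 0.71, 0)
```

Remember the caveat in the list above: there may be more than one trend direction, and a single principal component will average them. Variography remains the proper test.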

Final thoughts

This is a huge topic. There are a wide range of experiences and opinions. Regardless, the objective remains the same… we want to construct a fit-for-purpose representation of the real world that is an accurate predictor. Thankfully we can test our models (albeit at some cost) through drilling and (possibly) reconciliation.

So, jump in and make a start but think about what you are doing. Start with the data and classification of that data in a holistic sense. Use the tool that suits your needs and skills. And… avoid those default software settings. I’ve yet to see a ‘default’ ore body. It’s all about geology.
