For those who were flooded by Harvey or by the "controlled release", our thoughts and prayers are with you. As I was stuck at the house while Harvey poured down 51 inches of rain, surrounded by flooded streets and freeways, I thought about what would make this blog relevant to our members. I recall that a) the first time around, much of the machine and deep learning jargon appeared daunting, and b) after internally translating machine learning speak into G&G experience, I could relate much better. So here are three bullets to get started.

1) Model - our members may be thinking of a 3D model (x, y, z, properties) or a production type curve (time series). Machine learning models often come in as a weighted sum of features. Translation - features are attributes (e.g., AVO, density, porosity, IPs, net pay). Think of a machine learning model expressed as weight 1 * attribute 1 + weight 2 * attribute 2 + … + weight 9 * attribute 9 (and as many as we see fit).

2) Supervised and Unsupervised Learning - in supervised learning, for example, we have clearly labelled attributes (AVO, density, IP, etc.) and outcomes (e.g., gusher, dud or dry hole). The key is to determine the weights that will accurately predict the labelled outcome and consistently perform well on new data. In unsupervised learning, by contrast, we might feed 3D seismic images to the algorithm and let it figure out where the reservoirs are. An example is Google Brain, which learned about cats by "watching" millions of YouTube videos. (A neural network is an algorithm that is really good at exploring the underlying representation of data. The trick is to figure out the number of layers and the number of nodes within each layer.)

3) Reading on Deep Learning without Fear of Equations, Greek and Geek Overload - I highly recommend "The Master Algorithm" by Pedro Domingos. It can help us frame discussion on further exploring machine and deep learning, and think about applying them to what we do to make a difference.
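To make bullets 1) and 2) concrete, here is a minimal sketch of a model as a weighted sum of attributes, with the weights learned from labelled wells (supervised learning). The attribute names and values below are made up purely for illustration, and plain least squares stands in for fancier learning algorithms:

```python
import numpy as np

# Each row is one well; columns are attributes (features):
# AVO strength, density (g/cc), porosity. Values are hypothetical.
X = np.array([
    [0.9, 2.1, 0.25],   # gusher
    [0.8, 2.2, 0.22],   # gusher
    [0.1, 2.6, 0.05],   # dry hole
    [0.2, 2.5, 0.08],   # dry hole
])
y = np.array([1.0, 1.0, 0.0, 0.0])  # labelled outcomes: 1 = gusher, 0 = dry hole

# Append a column of ones so a bias (intercept) term is learned too.
A = np.column_stack([X, np.ones(len(X))])

# "Learning" = finding the weights that best fit the labelled data.
weights = np.linalg.lstsq(A, y, rcond=None)[0]

def predict(attrs):
    """weight 1 * attribute 1 + weight 2 * attribute 2 + ... + bias."""
    return float(np.dot(weights, np.append(attrs, 1.0)))

# Score new prospects the model has never seen:
gusher_like = predict([0.85, 2.15, 0.235])   # attributes resembling the gushers
dry_like = predict([0.15, 2.55, 0.065])      # attributes resembling the dry holes
print(round(gusher_like), round(dry_like))
```

The whole trick of supervised learning is in that one `lstsq` line: given labelled examples, find weights that predict the outcome well, then apply them to new data.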
(Prof. Domingos' book reminds me of "The Connection Machine" by Daniel Hillis, based on his PhD thesis at MIT, written in plain English and free of computer science jargon so that readers in other disciplines can read it and get it. Something we should aspire to in this blog, to accelerate shared learning and extend our reach.) P.S. "The Master Algorithm" - for fun, check your local library.