But I don’t WANT to go to work!!

I think anyone who has ever completed post-secondary education will understand the steep learning curve involved in moving from the classroom to the workplace. No more sleeping in till 11am and donning wrinkled sweats and a t-shirt for class. No more books to consult to find exactly the answer you are looking for, or professors collecting homework assignments to make sure you did them. I mean, who in their right mind would trade attending class a few hours per day and socializing with like-minded peers for sitting in rush hour traffic (or on a train that more often than not has “electrical” issues – if you live in Calgary) at 7:00am, working till 5pm and being so tired by the end of the day that all you can do is go home and sleep? But I suppose the transition is necessary; we can’t live in Mom and Dad’s basement and go to school forever, and it costs money to live on your own – so corporate world, here we come!

It’s certainly a tough transition, one that takes time to get used to. While you may acquire knowledge about what you will be up against in the workplace, you will seldom experience the exact situations themselves until you are there – when your employment, reputation and salary are on the line.

One thing I am proud to be a part of is helping university students make the transition. How, you ask? By providing them direct access to the tools they will encounter in the workplace, live and with the same training and support our corporate clients are accustomed to.

Since 2001, we have donated nearly $40 million worth of software and data to educational institutions across Canada and the United States, to help make the transition from school to work a little less daunting, as well as to make students more readily employable.

We were recently honored with entrance into the University of Western Ontario’s prestigious 1878 Societies, reserved for donors making notable cumulative donations. For more on this honor, please view the geoLOGIC and University of Western Ontario press release.

We believe our company’s success comes from giving back to the community and supporting the development of individuals entering into our industry, and as such are committed to providing them with all of the tools we have to make their transition as seamless and successful as possible. We are proud to be able to give back in this manner.

If you would like to enquire about having geoSCOUT made available at your institution, please contact us at sales@geologic.com and we would be happy to get in touch with your school.

Had a good (or bad) experience with geoSCOUT or geoLOGIC at your university? Tell us about it!

Take care, and happy learning!

Posted in Corporate Social Responsibility, Customer Education, Customer Support, geoLOGIC, geology, Industry Best Practices, mapping, Oil & Gas, oil and gas, Oil and Gas technology, petroleum, Public relations, software, Technology

Data Quality, what does it mean to You?

Data Quality, what does it mean to you?
To me, it means that I can implicitly trust what I see, whether it be a well ticket or a boat’s gas gauge (and E does not mean “Enjoy your ski, we have lots of gas and YOU won’t get stuck out in the middle of the lake” – but that is another story…). It means that the result set from a query for wells drilled year over year by producing zone, showing an uptake in Triassic-based plays, can be trusted. So that is my definition of data quality, but how do we get there? Or maybe the question is: how do we avoid becoming a data quality folk tale? This website (http://www.iqtrainwrecks.com) has some amusing stories, and this one (http://www.iqtrainwrecks.com/2011/03/17/gas-byproducts-give-pain-gut) strikes very close to home.
In my years of working with oil and gas data, it seems to me that data degrades very easily; it is only through hard work and consistent effort that you can get data to stay the same or even improve, year after year. Part of this comes from our own built-in avoidance of change; part of it is based on company culture. I found this article (http://www.forbes.com/sites/forbeswomanfiles/2011/09/15/the-remarkable-edge-a-breakthrough-environment-will-give-your-company) most interesting because it states that there are basic inputs that will help your company grow and thrive in this new and challenging century. To take it a step further, the basic tenets of data quality (Full, Accurate, Consistent and Timely data) can be tied to four of those inputs, plus one to sum it up.

  1.  Speed –> Timely data; it’s no good to know about a competitor well being abandoned after you have started drilling your own. You need to have that information when it comes off confidentiality and you need to trust that it will be there.
  2.  Reliability –> Consistent data; now you see me, now you don’t. Great game if you are playing it as a 3 year old; not so much if you are an oil & gas knowledge worker.
  3.  Quality –> Accurate data; we have all heard the folk tale of G&G staff spending anywhere from 1/3 to 2/3 of their time validating or finding data. Well, when you have an inherent built-in quality or trust level, then you move mentally from challenging the data to incorporating the data into the play.
  4.  Engagement –> Full data; when a play is being worked on, the G&G staff must know that all components are available. Logs, Cores, Analysis, Tops, Tests, IP, Reserves, Seismic, Pipelines, etc.
  5.  Innovation –> FACT based data; having all of the facts and then being able to challenge the status quo with sound information that is able to show opportunity.
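To make the tenets concrete, here is a rough sketch in Python of how a well record might be screened against them. The field names, rules and thresholds are all hypothetical for illustration, not geoSCOUT’s actual schema or checks:

```python
from datetime import date

# Hypothetical sketch: screening a well record against the FACT tenets
# (Full, Accurate/Consistent, Timely). All names and rules are illustrative.

REQUIRED_FIELDS = ["uwi", "spud_date", "status", "producing_zone"]

def is_full(record):
    """Full: every required attribute is present and non-empty."""
    return all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

def is_timely(record, as_of, max_age_days=30):
    """Timely: the record was refreshed within the last month."""
    return (as_of - record["last_updated"]).days <= max_age_days

def is_consistent(record):
    """Consistent: a simple cross-field rule - an abandoned well
    should not report current production."""
    if record["status"] == "abandoned":
        return record.get("current_mmcf_per_day", 0) == 0
    return True

well = {
    "uwi": "100/01-02-003-04W5/00",
    "spud_date": date(2010, 6, 1),
    "status": "producing",
    "producing_zone": "Montney",
    "current_mmcf_per_day": 1.2,
    "last_updated": date(2011, 9, 1),
}

checks = {
    "full": is_full(well),
    "timely": is_timely(well, as_of=date(2011, 9, 15)),
    "consistent": is_consistent(well),
}
print(checks)  # {'full': True, 'timely': True, 'consistent': True}
```

The point is less the specific rules than the habit: when checks like these run every load, trust is built in rather than re-earned on every query.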

Okay, we have defined some basic tenets for building up data quality. What’s the next step? Well, some review of existing websites is never a bad thing. It’s always easier to stand on the shoulders of others than to start from scratch. Sites or blogs that I like are:

  •  http://dataroundtable.com –> a series of blog postings from IQ or information quality thought leaders
  •  http://tensteps.gfalls.com –> a great site that has digital examples of what’s in Danette McGilvray’s book, Executing Data Quality Projects. I have a copy of this book and it’s getting dog-eared. ‘Nuff said.
  •  http://data-governance.blogspot.com –> Anything by Steve Sarsfield is gold. His latest articles on the root causes of data quality are great. As always, it’s easy to identify a problem, it’s harder to suggest a solution. And Steve does not shy away from providing solutions.
  •  http://bardess.com/blog/?p=202 –> Yet another article on what bad data is and what it is costing your organization. What’s interesting about this one is the five key data standards mentioned; Completeness, Accuracy, Timeliness, Uniqueness and Consistency. They match up with what I put forward earlier (Full, Accurate, Consistent & Timely) except I am missing Uniqueness. Why? Well, being a data vendor, I know that we create that single record of the truth (well) and why would you go anywhere else for public data?
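That Uniqueness dimension – collapsing duplicates into a single record of the truth – can be sketched too. A hypothetical example keyed on the UWI (unique well identifier), with illustrative field names:

```python
# Hypothetical sketch of the Uniqueness tenet: keep one record per UWI,
# preferring the most recently updated copy. Field names are illustrative.

def deduplicate(records):
    """Collapse duplicate well records to a single record of truth per UWI."""
    best = {}
    for rec in records:
        uwi = rec["uwi"]
        # ISO-format date strings compare correctly as plain strings
        if uwi not in best or rec["last_updated"] > best[uwi]["last_updated"]:
            best[uwi] = rec
    return list(best.values())

records = [
    {"uwi": "100/01-02-003-04W5/00", "last_updated": "2011-05-01", "status": "producing"},
    {"uwi": "100/01-02-003-04W5/00", "last_updated": "2011-08-01", "status": "suspended"},
    {"uwi": "100/05-06-007-08W6/00", "last_updated": "2011-07-15", "status": "producing"},
]

unique = deduplicate(records)
print(len(unique))  # 2 wells survive; the newer duplicate wins
```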

Data Quality, what does it mean to you?

Posted in geoLOGIC | 1 Comment

Analyze This, Visualize That ……

And Why Not “Scenarize”, Simulate, and Virtualize Them, Too?

(Part Two of  Data Deluge, Insights Drought)

“Over the next 24 months, executives say they will focus on supplementing standard historical reporting of data with emerging approaches that convert information into scenarios and simulations that make insights easier to understand and to act.”
– MIT Sloan Management Review: Analytics: The New Path to Value (2010)


The New Intelligent Enterprise Report Cover (Source: MIT-SMR)

How 3,000 Executives, Managers and Analysts from Around the Globe Propose to Deal with Data Deluge

In my previous blog on the subject, I sought to establish, using quoted materials, that an intelligent enterprise is “… one that uses analytics in a sophisticated manner to produce actionable insights from the flood of data available to it, which insights lead to wise decisions”.

Note that the initial quote for this blog reintroduces a basic finding of the study which forms the foundation for coping with data deluge: executives are actively seeking new tools that will a) facilitate development of insights from data and b) enable them to promptly act on such insights.

The special report New Intelligent Enterprise from MIT Sloan Management Review (henceforth, MIT-SMR in this blog) is based on responses from a global sample of 3,000 executives, managers and analysts. One very significant finding is that over the two-year period following the time of the study (2010), executives expect to access improved ways by which “complex insights” are communicated “…so they can quickly absorb the meaning of the data and take action on it.”

The expected new tools and approaches that they hope to use for this purpose include the following [listing format supplied]:

  • “data visualization and process simulation,
  • “… text and voice analytics,
  • “social media analysis, and
  • “other predictive and prescriptive techniques”.

A Quick Note on Visualization

A good and modern definition of visualization is given in this Wikipedia article: “… a tool or a method for interpreting image data fed into a computer and for generating images from complex multi-dimensional data sets…”.

Moreover, a cursory review of a number of Wikipedia articles on the subject has yielded various types of visualization tools and methods. Of direct interest to consumers/users of oil and gas data are the following:

  • Geovisualization
  • Flow visualization
  • Information visualization
  • Information graphics
  • Data visualization
  •  Scientific visualization

A brief on each one is in order –


Geovisualization, or Geographic Visualization, “…refers to a set of tools and techniques supporting geospatial data analysis through the use of interactive visualization.” Its focus is constructing knowledge rather than “knowledge storage or information transmission”.

Screenshot of exploratory spatio-temporal analysis tool. (Source: Wikipedia)

Flow Visualization

This type of visualization tool is mainly employed in fluid dynamics to render “flow patterns visible”. The visible (or visualized) flow patterns yield directly usable “qualitative and quantitative information”.

A model Cessna with helium-filled bubbles showing pathlines of the wingtip vortices. (Source: Wikipedia)


Information Visualization

This is an interdisciplinary approach to creating visual representations “… of large-scale collections of non-numerical information…”. Examples of such information include “… files and lines of code in software systems, library and bibliographic databases, networks of relations on the internet, …” and others of similar typology.

Partial map of the Internet early 2005. (Source: Wikipedia)

Information Graphics

Information graphics (or infographics for short) are the tool to use to make quick and clear graphic visual “representations of information, data or knowledge”. As produced, the graphics are able to quickly and clearly depict “complex information … such as in signs, maps, journalism, technical writing, and education”.


The Washington Metro subway map. (Source: Wikipedia)

 Data Visualization

The same source declares: “Data visualization is the study of the visual representation of data, meaning ‘information which has been abstracted in some schematic form, including attributes or variables for the units of information’”.

A data visualization of Wikipedia as part of the World Wide Web, demonstrating hyperlinks. (Source: Wikipedia)

Scientific Visualization

Our source says, quoting York University Psychology Professor Michael L. Friendly, “Scientific visualization is an interdisciplinary branch of science ‘primarily concerned with the visualization of three-dimensional phenomena (architectural, meteorological, medical, biological, etc.), where the emphasis is on realistic renderings of volumes, surfaces, illumination sources, and so forth, perhaps with a dynamic (time) component’”.

A scientific visualization of a large simulation of a Rayleigh–Taylor instability caused by two mixing fluids. (Source: Wikipedia)


= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = ==

In the next and final installment I will touch on “Scenarizing” (scenario building/scenario thinking) plus Simulation and Virtualization as the other tools executives would want to use to quickly gain and act on insights from the masses of data that come their way.

I will also cite some examples and illustrations of how E&P software packages used in the Oil and Gas industry are enabling users to deal effectively with data deluge challenges and opportunities. Particular mention will be made of geoSCOUT and VISAGE.


Posted in Data Management, database, geoLOGIC, geological data, geology, Oil & Gas, Oil and Gas technology | 3 Comments

Montney Part 3

(Part of a joint series combining the power of VISAGE visual analytics software with geoLOGIC’s value-added data live from the gDC)

Welcome to the third and final post on the Montney. I decided to do it a bit differently today and show some geoSCOUT results in conjunction with the results and charts provided by VISAGE. So… let’s get started! Remember, just click on an image to see it magnified.

One of the questions I most commonly get asked about the gDC is why you might need access to data outside of geoSCOUT. Since geoSCOUT contains much of the same data (almost identical except for a couple of very specific items), many people can’t see the benefit of connecting to the data itself. I think the best way to explain it is to give you an example of where you can get some really interesting results by working with the raw data. The first is through VISAGE. This chart shows that the direction a Montney horizontal well is drilled in can really affect production results.

The second way I think looking at this data is interesting is to take VISAGE results and map them. What I will show below is really more illustrative, because I can’t post a map large enough to cover the whole Montney play – but I want to start explaining a little more about what is possible for our users. Because VISAGE openly allows you to query the data free-form, you can get to any results you want, including percentiles. And with geoSCOUT’s user database I can work with those results to create a map. Here I am showing the production results in an area color-coded to percentiles: the ninetieth percentile is red and everything at the 50th percentile and lower is yellow (the other colors are explained in the legend). You can see increased production in the north-east corner, while the outlying areas are spotted with yellow, low-producing wells.

What matters more: frac spacing or frac count? What I have done below is layer results from VISAGE (number of fracs and space between fracs) over the large producers to see whether more of the high producers line up with frac spacing or number of fracs. This first chart shows the number of fracs, grouped into fives, layered on the top 20% of producing wells. My data did go from 1-50 fracs, but I found this was the most popular range.

The third map shows frac spacing. What I found is that a large number of the high-producing wells line up with the 0-225 m average spacing in this area. Have a look at the VISAGE | News blog to see a chart of normalized production sorted by spacing. These kinds of maps can be even more effective when you integrate proprietary information, which you can do in either geoSCOUT or VISAGE.

Finally below I include a Visage chart of the top producing companies in the Montney. Remember to click on the chart to get a closer look.




There you go – hopefully we showed you some interesting results, and maybe you even learned a little about the tools at your disposal, which is in part our goal: to educate and expand your thinking! For more charts visit the VISAGE | News blog (find top producing wells in the Montney among other results…)

Posted in Association Partnerships, Customer Education, Customer Support, Data Management, database, geoLOGIC, mapping, Oil & Gas, oil and gas, Oil and Gas technology, petroleum, Product Development, software, Software Development, Technology

Go back to school with the CSPG

September is here, and with it comes the back-to-school frenzy. I don’t know about you, but I always get a little nostalgic when I see kids with their cute outfits and shiny new school supplies filing into the classroom for another year, ready to gain all kinds of knowledge. Sometimes I wish I could go back to those days when September 1st meant so much more, but then I remember pulling all-nighters, studying subjects I had no interest in for hours on end and having my essays magically erased by the computer (both my best friend and my worst enemy), and I feel very grateful that my school days are behind me.

If fall gets you eager to do a little learning of your own, minus the hours of homework and the nasty exams, you’ll be happy to know that the Canadian Society of Petroleum Geologists’ technical luncheons will be starting up again on September 13th! geoLOGIC is, once again, a proud sponsor of the CSPG Technical Luncheon series, and we are very excited about the always-informative talks lined up for this fall. The first four speakers and their topics include:

  • Professor Bruce Ainsworth: Drumheller Revisited
  • Nick Eyles: Canadian Broadcasting Corporation’s Geologic Journey – World
  • Roger M. Slatt: Recent Advances in Characterization of Shale Resources for Exploration and Production
  • Luis A. Buatois: Applications of Ichnology in Facies Analysis and Sequence Stratigraphy


For more information on the CSPG Technical luncheons, or to buy tickets, go to http://www.cspg.org/events/events-luncheons.cfm. Tickets for these events do sell out, so make sure to buy one early!

In the spirit of going back to school, we hope to see you on September 13th. After all, we can’t let the kids have all the fun this fall!

Posted in Association Partnerships

Montney Part 2: Alberta vs. BC

(Part of a joint series combining the power of VISAGE visual analytics software with geoLOGIC’s value-added data live from the gDC)

One of the ways we have decided to look at Montney production in this series is to compare what has happened in the Alberta Montney play over time to what has happened in BC. Alberta has been slow and steady whereas the BC production has sky-rocketed over the last couple of years. This is to be expected given the growth in horizontal drilling. Here are just some of the stats we have come up with:

  • BC production rates surpassed Alberta for the first time in Feb 2009 and continued to grow to 3.5 times that of Alberta in May 2011
  • However, Alberta is ahead in terms of the total gas produced to date in the Montney, accounting for 69.4% of the 2,456 bcf of cumulative gas produced
  • If both provinces were to sustain the May 2011 rates it would only take until September 2014 for BC’s cumulative production to surpass Alberta’s


  • May 2011 production summary:
    801 horizontal wells producing 1306 mmcf/day (89.6% of production) 
    455 vertical wells producing 87 mmcf/day (6.6% of production)
    266 deviated wells producing 60 mmcf/day (4.1% of production)
    26 crooked wells producing 5 mmcf/day (0.3% of production)


The first Montney wells came on stream in July 1977 in Alberta. The most pronounced jump in production (in Alberta) started in November 1995 growing to 274 mmcf/day a year later. Total Montney production has grown to 1457 mmcf/day in May 2011.

  • Total number of Montney wells that have produced = 2261 (excludes commingled wells)
  • 89% of that production came on stream since Sept 2008
  • First horizontal wells came on stream in Sept 2001 in Alberta
  • 70% of the horizontal wells are in BC
  • 21% of current (May 2011) Montney gas production came on stream in 2011
  • 62% of current (May 2011) Montney gas production came on stream since Jan 2010
  • Biggest growth in Production was in 2010, contributing 635 mmcf/day
  • Total BC Production in May 2011 (884 wells) = 1130 mmcf/day (78%)
  • Total AB Production May 2011 (778 wells) = 327 mmcf/day (22%)

  • Cumulative Montney gas produced to date = 2,456 bcf
  • Cumulative Alberta Montney gas produced to date = 1,704 bcf (69.4% of total)
  • Cumulative BC Montney gas produced to date = 752 bcf (30.6% of total)
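The cumulative figures above are internally consistent; a quick arithmetic check:

```python
# Verify the provincial shares of cumulative Montney gas quoted above.
alberta_bcf = 1704
bc_bcf = 752
total_bcf = alberta_bcf + bc_bcf

ab_share = round(100 * alberta_bcf / total_bcf, 1)
bc_share = round(100 * bc_bcf / total_bcf, 1)

print(total_bcf, ab_share, bc_share)  # 2456 69.4 30.6
```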

Stay tuned over the next week or two for more on the Montney. As always we welcome your feedback and suggestions about what you would like to see in upcoming blogs.


Want to know what happens when the price of gas changes? Is Montney production affected? Have a look at the VISAGE | News blog for one last piece of the puzzle

Posted in Data Management, database, geoLOGIC, geology, Industry Best Practices, oil and gas, Oil Sands, petroleum, Product Development, software, Software Development, Technology

Building Limits

I just spent the weekend building a deck.  As far as decks go it isn’t all that much – a simple “ground level” 10’ x 24’ deck on some concrete deck pier blocks.

Deck Foundation Block


The foundation isn’t all that much – just those concrete blocks sitting on the ground, but then the deck is for a summer cabin that’s only going to be used on random weekends, so I don’t expect a whole lot of performance from it.  But, as I was building the deck I did think about how foundations really do define the limits to what you can do.


Consider a residential building vs. a commercial building.
A residential building can be a detached house, a duplex (or fourplex) or an apartment block. In Calgary, if that apartment block is less than 5 floors it can be wood framed, but anything taller falls under the same requirements as a commercial building. Now, it goes without saying (but I’ll say it anyways) that if you start with a foundation suitable for a house, that foundation will not be able to support an apartment building, never mind an office tower or a warehouse, because it was never designed for that task. By the same token, a foundation for a “big box” retail outlet will not be appropriate for a duplex or a summer cabin at the lake. And when you know what is expected of your building, then you can ensure that you design the proper foundation for that building.
The fact of the matter is that in building anything, you need to have a clear idea of what the item you are building will be required to do before you begin planning how you are going to construct it.  And this brings us to geoSCOUT.


The first release of geoSCOUT was in late 1993, but the initial concept designs were first sketched out in the mid-’80s with the then brand-new Macintosh as the expected platform.  As the PC platform became the de facto business standard, and MS-DOS and Windows became the de facto PC operating system standards, our foundation was modified to reflect the prevailing (and potential) standards.  What this meant was that, on release, geoSCOUT was a fully Microsoft Windows-based application suite.  It also meant that as Microsoft continued to evolve Windows, geoSCOUT was able to keep up, so when Windows became able to support long filenames, geoSCOUT was in a position to do the same immediately.  When new printers became available with Windows drivers, geoSCOUT was able to use them as soon as they were installed at a client site.
Since geoLOGIC had made the commitment to Microsoft Windows, we went all in.  To accommodate the sophisticated mapping functions that geoSCOUT is well known for, we knew that the requirement would be for high-performance code.  geoSCOUT was developed initially in Microsoft Visual C 6, and over time upgraded through several versions of C and C++ to today’s Visual Studio 2010 C++ environment.
Since we were already using the Microsoft toolset it only made sense that geoLOGIC sign up with Microsoft’s Partner Program, and over time we were able to attain Gold Certification as an ISV (Independent Software Vendor), a status that we have held for over 4 years.


It’s like I said at the beginning of this post.  If you intend to build a commercial property, you need a strong foundation that will support that development.  With geoSCOUT, the product has evolved – A LOT – over the past 18 years, but the original foundation is still there and it is still strong enough to support the commercial development that our user community expects (demands?) of us.  We still release 3 versions per year.  And each of the versions contains a mix of around 300 new features, enhancements, and fixes.  Of course, our user community submits about 3000 feature requests per year, so we have no shortage of things to develop 🙂  But we have a strong foundation to work from.

Posted in geoLOGIC, History, Product Development, software

Preliminary Build

By now you know that a ton of time and effort has been spent ripping the guts out of the good old ’69 Mustang. And what have I got for it? The ability to build it up the way I want. If I want to bore out the engine block, I can do it. If I want to put in a killer sound system, I can do it. But I get ahead of myself. What I really want is a functioning, wheels-moving-forward vehicle. Not totally stock, but not totally tricked out either.
This brings me to the topic of today’s post: the preliminary build. And how can you do a preliminary build for a car? Isn’t a preliminary build just another way of saying “Well, our first try at that turned out to be wrong, so we are doing it again”? Maaaaybe. It is about trying something, seeing how it worked out and then taking your learnings (or lumps) from it to do better the next time. Replacing the electrical system in a car is a huge undertaking, especially if you are a grease monkey, not a wirepuller. But a tremendous amount of knowledge has been gained from tracing every wire from the fuse box out to the end point (radio, light, heater, starter, etc.) by:

  1. Ensuring that it actually got there.  Sounds simple, right? Well, when you are dealing with approximately 50 wires of varying colours and shades of colours, and you are upside down looking at or under a wheel well to fit it in, it’s not that surprising that one or three get mislaid.
  2. Ensuring that it brought power along.  Ahh, the infamous pinch. This was one of the reasons I decided to replace that 40-year-old-plus wire. I did not want an old wire that looked okay but was pinched inside to be causing me problems. But in the pursuit of laying down the new wire in that incredibly hard-to-get-at place, it happens. Sometimes. And only careful inspection and voltage checking ensures that when 12 volts go in, 12 volts come out.

So yeah, this is turning into a larger and longer project than I first thought, but now that we have knocked out a dozen or so electrical gremlins and only have one left, I feel pretty good. The car runs very nicely (pre tune-up); it just does not have all of the electrical system working. And if you think that I am going to drive my car without listening to some Beatles, you are mistaken.

Now to discuss ‘PPDM in a Box’ and the work on its latest release, v1.2. A lot has been learnt about the internals of the application (both DB & middleware). For example, one item around data management is dealing with when a record was created and when it was last updated. Pretty important stuff when you deal with data that has SOX implications. And when you deal with a data model that has over 1700 tables, and you have 4 columns per table to update, it leads to a lot of extra time/effort/maintenance. Fortunately, there are things like database triggers that can be implemented once and never touched, and that will mindlessly do these simple date/user capture tasks over and over again; consistently, I might add. But not everyone has good experience with triggers, and rightfully so. Without a carefully planned architecture, they can be a maintenance nightmare and the start of a true ‘black box’ where good stuff goes in and ‘most of the time’ better stuff comes out. Anyway, the development team was at a crossroads on what to do with this. We do need to track who or what process created a record, and then who or what process was the last to change it, and report on it – but man, we have a long list of enhancements already… So it got put on the back burner for now. Or so they thought. I went through and created the database triggers for both Oracle & SQL Server and implemented them on the development databases. We have been using these databases for the last 2 months for running our code against: inserting wells, adding production, tests, cores, renaming wells, deleting business associates. All stuff that is normally done in PiaB. And I have yet to hear a peep or see a bug report that says one of these background triggers stopped working or caused something to fail.  Do I see a future article on data quality here?  I think so.
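For illustration, here is a minimal sketch of that audit-column trigger pattern, using SQLite from Python (the actual PiaB databases are Oracle and SQL Server, whose trigger syntax differs, and the table and column names here are illustrative, not the real PPDM schema):

```python
import sqlite3

# Sketch: triggers that stamp "who created / who last changed" columns
# so no application code has to remember to do it. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE well (
    uwi TEXT PRIMARY KEY,
    well_name TEXT,
    row_created_by TEXT,
    row_changed_by TEXT
);

-- Fires once per insert: stamp who created the row.
CREATE TRIGGER well_created AFTER INSERT ON well
BEGIN
    UPDATE well SET row_created_by = 'loader_process'
    WHERE uwi = NEW.uwi AND row_created_by IS NULL;
END;

-- Fires when the well name changes: stamp who changed the row last.
CREATE TRIGGER well_changed AFTER UPDATE OF well_name ON well
BEGIN
    UPDATE well SET row_changed_by = 'edit_process'
    WHERE uwi = NEW.uwi;
END;
""")

conn.execute("INSERT INTO well (uwi, well_name) VALUES ('100/01', 'Discovery #1')")
conn.execute("UPDATE well SET well_name = 'Discovery No. 1' WHERE uwi = '100/01'")
row = conn.execute("SELECT row_created_by, row_changed_by FROM well").fetchone()
print(row)  # ('loader_process', 'edit_process')
```

In Oracle you would typically do this in a BEFORE INSERT/UPDATE row-level trigger assigning :NEW values instead, but the point is the same: the audit columns get stamped consistently, in one place, no matter which code path touched the row.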

Does this fall under the category of preliminary build? I think so. We are a lot smarter now because we know that these triggers can work silently in the background, and they do. We still have to implement this in a true production release, which means trying to think of ways to stress test it. But I think loading in a couple of billion production rows will do that…

In summation, we are not there yet for either the ’69 or PiaB. But we have picked up a few knowledge points along the way. In fact, we figured out that when you restore an old car, you can never, never ever ground it enough. That point was brought up AND reinforced by the chief mechanic’s thesis professor (yeah, don’t ask how that conversation came about) who, it turns out, restored a car when he was in grad school oh so many years ago. And we learned that sometimes database triggers are a very good thing that saves time and money and adds consistency; they just need some care and feeding and architecting to make them add business value.

Posted in Data Management, geoLOGIC, Product Development, software

Montney Part 1

 (Part of a joint series combining the power of VISAGE visual analytics software with geoLOGIC’s value-added data live from the gDC)

There will be 2 more Montney blog posts coming up in the next week here at VISAGE and geoLOGIC. We thought it would be interesting to showcase our data in such a way that readers can see the value in accessing production information “hot off the presses” as it were.

Something surprising came up when we ran the data for the Montney: in the last month we saw a 13.4% drop in production! This is the first drop in production since the Montney became a major play in Western Canada. We quickly discovered it was due to the McMahon gas processing plant turnaround… the question remains: will this shut-in continue to affect production from the Montney?

Some statistics for you:

  • Montney now accounts for 30% of BC’s gas production


  • In the last 4.5 years, Montney has grown from 2.5% of BC’s Gas production to just over 30%.


  • Of the recent 12.7% drop in BC gas production, Montney accounts for 32% of the total drop.

 (As always you can click on the image for a closer look)


A Closer Look at the Montney

Montney production in BC has dropped 13.4% (151.6 mmcf/day) from May to June. This drop is attributable to only a 3% drop in producing wells (which works out to 23 wells seemingly affected by the plant turn-around). For your interest I have also included the Nymex spot price on this chart to show that Montney growth has continued in spite of lower gas prices.
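As a quick sanity check on those numbers (a sketch, using only figures quoted in this series): a 151.6 mmcf/day decline described as 13.4% implies a May baseline of roughly 1,131 mmcf/day, which lines up with the 1,130 mmcf/day BC total reported in Part 2:

```python
# Implied May baseline from the quoted drop.
drop_mmcf = 151.6       # May -> June decline, mmcf/day
drop_fraction = 0.134   # the same decline expressed as 13.4%

implied_may_rate = drop_mmcf / drop_fraction
print(round(implied_may_rate))  # 1131
```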

Interested in learning more about the key players affected by the shut in and what happened to their production? Have a look at the VISAGE BLOG for more insights….




Disclaimer of Analysis Results
It is important to note when doing any kind of analysis like this that the selection of wells be scrutinized for meaningful results. The results presented are intended to fuel your understanding of VISAGE capabilities with gDC data. We would be happy to run analyses for specific well lists that you may have. Please feel free to contact us.



Posted in Uncategorized

geoLOGIC systems Spaghetti Western Thank You!

I must first off apologize for the delay in thanking everyone for their participation in this year’s Spaghetti Western stampede party; I have to confess it has been a very busy summer here at geoLOGIC!

If you were there, you will remember that on July 12th geoLOGIC turned Prince’s Island Park into a sensory delight. As the only Stampede celebration on the Island, we had the fantastic opportunity to work with the renowned River Café to present a Stampede party unlike any other. While the morning began looking as though rain might make an unwelcome appearance, my order to Mother Nature for sunshine was fulfilled at the very last minute, negating the necessity for the thousands of rain ponchos we brought just in case – but that didn’t stop some of our guests from donning them anyway!

We wanted to do something a little out of the ordinary since our guests are far from it, so in addition to the prime locale, we put together a menu we were sure wouldn’t be seen at any other Stampede party. The spread included spit roasted Spragg farm pulled pork sandwiches, smoked wild sockeye salmon flatbread; black eyed pea, bean and roasted heirloom tomato salad and Poplar Bluff organic potato and arugula salad.

Spaghetti Western River Cafe Catering

We featured some great entertainment courtesy of Jory Kinjo and his band and made sure to pepper in some classic country to keep everyone line-dancing in their country duds. A good mix of classic and modern, country and city, the Spaghetti Western delivered.

With over 1300 guests through the gates, we are very excited that after a two-year break the industry still remembers the geoLOGIC Spaghetti Western as a must-attend event. We look forward to this celebration all year long as an opportunity to personally thank all those we work with in this industry, and our goal is to create an experience that our guests will not forget.

This being our very first year at the River Café location, with a new menu and entertainment, the Spaghetti Western had a little different flavor this year. We have already begun planning next year’s bash and want to make sure, in keeping with tradition, that the party is everything you hoped it would be. So I want to throw it out there and ask all of you who were there what you thought. What did you like, or not like? What would you like to see next year?

On behalf of everyone here at geoLOGIC, I would like to extend a heartfelt thank-you to everyone who came out on July 12th and helped us make this one of our most successful stampede events ever! We really couldn’t have done it without you!

I can’t wait to hear what you thought of the event, because like everything else at geoLOGIC we use your feedback directly to make things work better for you and this is no exception. So please, comment away!

Posted in Corporate Social Responsibility, oil and gas, petroleum, Public relations | 8 Comments