Last week, I had the honor of moderating the OSIsoft 2014 User Conference in San Francisco. Over 2,000 professionals came together to discuss the value and use of real-time data across different industries. There were a ton of really interesting and inspiring customer presentations. It’s just amazing to see how much companies rely on analytics these days to keep their operations running and/or to improve their situation.
Combating the Polar Vortex
One of the keynote presentations of the conference really stuck out, and I want to share the content with you. Columbia Pipeline Group (CPG) operates close to 16,000 miles of natural gas pipelines in the US. Keeping the gas flowing reliably and safely is not easy to begin with. But doing that during the polar vortex that struck the East Coast of the US earlier this year is even harder. CPG turned to real-time data and analytics to keep their assets safe. The benefits of using data are tremendous, as outlined in Emily Rawlings’ presentation:
Estimated $2.8M in savings from event prevention (outages, etc.)
Increased customer confidence
Improved asset reliability
Expanded operational visibility
If you have a few minutes to spare, take a look at Emily’s cool presentation:
Real-time data is all around us. Modern sensors allow us to capture enormous amounts of data at extremely high frequencies. Here is an example: grid operators nowadays utilize so-called synchrophasors (also called PMUs) to record over 40 different KPIs at 120 Hz. They use this information to keep our electric supply safe and stable. Shift managers use real-time data to keep production lines running and performing. However, managing this type of data requires a different type of technology. It’s not your typical big data problem. You can’t just stick this high-speed stuff into a simple relational database. That would be like driving a Formula 1 car through the desert.
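To get a feel for the volume involved, here is a quick back-of-the-envelope calculation. The 40 KPIs and the 120 Hz sampling rate come from the example above; the fleet size and the bytes-per-value figure are illustrative assumptions of mine, not real grid numbers.

```python
# Rough data-rate estimate for a fleet of synchrophasors (PMUs).
KPIS_PER_PMU = 40      # measurements per PMU (from the example above)
SAMPLE_RATE_HZ = 120   # samples per second per KPI (from the example above)
PMU_COUNT = 500        # assumed fleet size (hypothetical)
BYTES_PER_VALUE = 8    # assumed: one 64-bit float per measurement

values_per_second = KPIS_PER_PMU * SAMPLE_RATE_HZ * PMU_COUNT
bytes_per_day = values_per_second * BYTES_PER_VALUE * 86_400  # seconds per day

print(f"{values_per_second:,} values/s")           # 2,400,000 values/s
print(f"{bytes_per_day / 1e12:.1f} TB/day (raw)")  # 1.7 TB/day
```

Even with these modest assumptions, that is millions of inserts per second, sustained around the clock, which is exactly the workload a row-oriented relational database was never built for.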
Monitoring grid stability in real-time
My new employer OSIsoft has been helping companies capture, archive, and analyze real-time data for over 30 years. It’s quite an amazing success story. It all started with a brilliant idea to develop a high-performance time-series database (the famous PI System). This has gradually developed into a true infrastructure for managing all kinds of real-time data across different industries. If you want to find out more about this, take a few minutes to watch the recent keynote from our EMEA User Conference 2013 in Paris. If you want to skip my opening words, you can safely jump ahead to minute 10.
No doubt – there is tremendous value in data. I use data collected from a small sensor in my bike to improve my cycling performance. Factories leverage data to keep their machines humming as long and as efficiently as possible. Unfortunately, most companies have historically tried to keep data for themselves. Sharing was a foreign concept. Security concerns and cultural barriers (“It’s my data!”) have fostered this environment.
“Share your knowledge. It is a way to achieve immortality.” ― Dalai Lama XIV
What if we could share critical data with relevant stakeholders in a secure and effective way? Would we be able to improve our performance? Take a look at this short video to see what can happen if you start sharing subsets of your data. It is a fascinating scenario.