Energy company deploys edge devices to improve drilling efficiency

Moving compute and storage resources to edge locations can reduce latency and bandwidth needs, improve performance and save money. At the same time, widespread edge computing deployments can introduce significant management challenges. Servers can be hard enough to maintain when they’re in an on-prem data center. What if they’re deployed in the middle of nowhere?

Energy companies know all too well the challenges of remote computing.

“When we drill a well, it’s always in the middle of nowhere,” says Dingzhou Cao, senior advisor for data science at independent shale producer Devon Energy, a Fortune 500 company based in Oklahoma City, Okla.

Sending a massive stream of data back to a central location isn’t always feasible, but the company still wants to know what happens at its sites. “We always have a bad connection to the Internet,” he says. “Ninety percent of the time it’s available, but 10 percent of the time it’s not.”

Ninety-percent availability is acceptable when the data is for monitoring purposes only, but if a real-time response is required, it’s a problem. In particular, the company is looking to improve drilling and operations efficiency and automate tasks with machine learning. For that, on-site team members need immediate data analysis.

“We want to move everything to the edge, so that even if we lose the connection, the guy at the site can still see what’s going on and make a decision,” Cao says. “If you rely on cloud computing and lose the connection, then the field guy can’t see the results.”
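The pattern Cao describes, running analysis locally and treating the cloud link as optional, can be sketched roughly as below. This is a minimal illustration under assumed details, not Devon Energy’s actual stack: the sensor reader, scoring model and upload endpoint are all hypothetical stand-ins.

```python
import json
import queue
import time
import urllib.request

# Hypothetical cloud ingest endpoint; a real deployment would point at the
# operator's central data platform. Everything here is an illustrative sketch.
CLOUD_URL = "https://example.com/ingest"

pending = queue.Queue()  # results waiting to sync whenever the link is up


def read_sensors():
    """Stand-in for reading rig telemetry (weight on bit, RPM, torque, ...)."""
    return {"ts": time.time(), "wob": 12.5, "rpm": 110, "torque": 8.3}


def score(sample):
    """Stand-in for the on-site ML model; flags a possible inefficiency."""
    return {"ts": sample["ts"], "inefficient": sample["torque"] > 8.0}


def try_upload(result):
    """Best-effort push to the cloud; failure is expected part of the time."""
    data = json.dumps(result).encode()
    req = urllib.request.Request(CLOUD_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=2)
        return True
    except OSError:
        return False


while True:
    result = score(read_sensors())
    print("local decision:", result)   # the field crew sees this immediately
    pending.put(result)
    # Drain the backlog opportunistically; keep it queued if the link is down.
    while not pending.empty():
        item = pending.get()
        if not try_upload(item):
            pending.put(item)          # requeue and stop trying for now
            break
    time.sleep(1)
```

The key design point is that the decision loop never blocks on the network: the local result is produced and displayed first, and uploading to the cloud is a separate, best-effort step that simply falls behind when connectivity drops.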

To make it happen, the company faced the challenge of putting, in effect, small data centers out in these far-flung locations.

Read the full article at Network World.