In a 2014 blog post, Information Week contributing editor Lenny Liebmann looked at the emergence of big data and saw a need for a set of best practices to ensure that data science happens efficiently and reliably.
Liebmann called this discipline “DataOps,” or “the set of best practices that improve coordination between data science and operations.”
Seven years later, DataOps is a “top trend,” according to VentureBeat, and the race is on to sharpen the definition and claim a piece of the market.
There are many definitions of DataOps, but a recent Innotech piece lays one out particularly well: “DataOps orchestrates the use of data across an organization to continuously uncover value.”
Here’s how other experts view DataOps:
- According to Gartner, “DataOps is a collaborative data management practice focused on improving the communication, integration, and automation of data flows between data managers and data consumers across an organization.”
- A team of Forrester analysts, including Michele Goetz, vice president and principal analyst, define DataOps as “the ability to enable solutions, develop data products, and activate data for business value across all technology tiers from infrastructure to experience.”
DataOps may sound like a no-brainer for big tech companies that were born to embrace it. Heavy industry, on the other hand, has taken a slower path to the data goal line.
It’s fair to say that in 2021, DataOps is finally going mainstream in the industrial world, bringing the data mindset to companies in manufacturing, energy, and power and utilities, to name a few. To compete in the age of connectivity, these industries must learn how to get value from the unprecedented amounts of data available to them. And the pressure to do so has never been higher.