
Why it’s important to operationalize big data into daily tasks

Source: techrepublic.com

Big data analytics is no longer a nice-to-have for enterprises: it's now mission-critical.

In 2019, Veritas said, "In just a few years, big data has advanced from scattered experimental projects to achieve mission-critical status in digital enterprises, and its importance is increasing. According to IDC, by 2020, organizations able to analyze all relevant data and deliver actionable information will earn $430 billion more than their less analytically oriented peers. Big-data analytics, once performed on an occasional basis, are now performed daily at many enterprises, including Amazon, Walmart, and UPS."

Yet organizations continue to struggle to operationalize it.

Gartner defines big data operationalization as "the application and maintenance of predictive and prescriptive models. Both clients and vendors are placing an emphasis on the importance of moving data science out of a prototype environment and into a state of production and continuous improvement."

In other words, to operationalize big data, you have to move it out of the test sandbox and into an active role in the business.

The most active roles for big data in the business to date have been in decision support. 

  • Consumer buying patterns from web-based data inform retailers about which products are moving fastest, who is buying them, and where they are being bought.
  • Diagnostic analytics systems enhanced by machine learning inform medical practitioners about the most likely diagnoses and treatments for certain conditions.
  • Sensors placed along tram tracks and on key pieces of equipment inform cities which areas in their physical tram systems require immediate or near-term repair so the system will not fail.

All of these examples illustrate a first tier of big data analytics deployment: they use unstructured big data, and their role is to provide static reports that managers can act upon.

Using analytics in daily workflow

However, when you fully operationalize analytics, there is also a second, more active tier of engagement in which companies embed big data analytics directly into the daily workflows of their operations. In these instances, the analytics continue to inform decisions, but they also automate certain tasks in company workflows based upon the intelligence they glean from the data.

A great example of system automation in operations is decision-making in bank lending. For many years, software programs assessed a loan applicant's creditworthiness and determined a "lend" or "don't lend" decision and a loan rate that took into account the loan applicant's credit status, the size of the loan, and the amount of risk.

The lending supervisor still has the final say, but in essence the lending software has made the decision.
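To make the lending example concrete, here is a minimal Python sketch of what such rule-based decisioning might look like. The thresholds, rate premiums, and field names are illustrative assumptions, not any real lender's policy or a specific vendor's software.

# A minimal sketch of rule-based loan decisioning, as described above.
# Thresholds, premiums, and field names are illustrative assumptions only.
from dataclasses import dataclass

BASE_RATE = 0.06  # assumed baseline annual rate

@dataclass
class LoanApplication:
    credit_score: int      # e.g. 300-850
    loan_amount: float     # requested principal
    debt_to_income: float  # monthly debt / monthly income

def decide(app: LoanApplication) -> dict:
    """Return a lend/don't-lend decision and a risk-adjusted rate."""
    # Hard rejection rules: poor credit or over-leveraged applicants.
    if app.credit_score < 580 or app.debt_to_income > 0.45:
        return {"decision": "don't lend", "rate": None}

    # Risk premium grows as credit quality drops and loan size rises.
    risk_premium = (720 - min(app.credit_score, 720)) * 0.0001
    size_premium = 0.005 if app.loan_amount > 250_000 else 0.0

    return {
        "decision": "lend",
        "rate": round(BASE_RATE + risk_premium + size_premium, 4),
    }

if __name__ == "__main__":
    print(decide(LoanApplication(credit_score=680, loan_amount=300_000,
                                 debt_to_income=0.30)))
    # -> {'decision': 'lend', 'rate': 0.069}

In practice the scoring would come from a model trained on historical repayment data rather than hand-coded rules, but the automation pattern is the same: data in, decision and rate out, with a supervisor's sign-off as an optional final step.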

We can extend this model into the area of maintaining a city tram system.

Internet of Things (IoT) sensors are attached to key pieces of track and equipment. The sensors can detect signs of failure in these physical components before failure occurs. Data is collected, and reports are generated for supervisors, who then organize preventive maintenance tasks and routes.

Now, what if these analytics could be operationalized even further? For instance, an analytics system picks up big data in real time from IoT sensors dispersed throughout the city’s transit system. The system analyzes this data and produces maintenance reports for supervisors—but it also interfaces with a work order planning system that organizes maintenance work by location and sequences work orders for crews.

These work orders could be dispatched directly to maintenance crews, or the organization could choose to have a human supervisor review and then authorize the work orders before issuance.
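As a rough illustration of that flow, the Python sketch below turns high-risk sensor readings into work orders grouped by location, with a flag that decides whether a supervisor must release them before they reach crews. The field names, risk scores, and threshold are assumptions for illustration, not a real transit system's schema.

# A minimal sketch of the sensor-to-work-order flow described above.
# Payload fields, the risk threshold, and the dispatch step are assumptions.
from dataclasses import dataclass
from typing import List

FAILURE_RISK_THRESHOLD = 0.7  # assumed cutoff for creating a work order

@dataclass
class SensorReading:
    asset_id: str        # track segment or piece of equipment
    location: str        # district or line used for route grouping
    failure_risk: float  # score produced by the analytics model (0-1)

@dataclass
class WorkOrder:
    asset_id: str
    location: str
    priority: float
    approved: bool = False

def build_work_orders(readings: List[SensorReading]) -> List[WorkOrder]:
    """Turn high-risk readings into work orders, sequenced by location then risk."""
    orders = [
        WorkOrder(r.asset_id, r.location, r.failure_risk)
        for r in readings
        if r.failure_risk >= FAILURE_RISK_THRESHOLD
    ]
    # Group work by location so crews travel less; highest risk first.
    return sorted(orders, key=lambda o: (o.location, -o.priority))

def dispatch(orders: List[WorkOrder], require_supervisor: bool = True) -> List[WorkOrder]:
    """Optionally hold orders for human review before they reach crews."""
    if require_supervisor:
        return orders  # queued for a supervisor to approve and release
    for order in orders:
        order.approved = True  # auto-approved and sent straight to crews
    return orders

if __name__ == "__main__":
    readings = [
        SensorReading("track-segment-12", "north-line", 0.91),
        SensorReading("switch-gear-07", "north-line", 0.55),
        SensorReading("overhead-wire-03", "east-line", 0.82),
    ]
    for order in dispatch(build_work_orders(readings), require_supervisor=False):
        print(order)

Calling dispatch with require_supervisor=True models the human-in-the-loop option described above; setting it to False models fully automated issuance to crews.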

By integrating big data analytics into day-to-day task loads that go beyond just reporting (i.e., tier-two operationalization), organizations can achieve greater returns from their analytics and big data investments.

This is more important than ever: just last year, VentureBeat reported that 87% of data science projects still never make it into production.

Going forward, we can't afford this level of failure for big data and analytics. Operationalizing them in business workflows, not just in static reports, is all the more vital.
