
WHEN ARTIFICIAL INTELLIGENCE MEETS BIG DATA

Source: analyticsinsight.net

“Gone are the days of data engineers manually copying data around again and again, delivering datasets weeks after a data scientist requests it,” says Steven Mih of the revolution that artificial intelligence is bringing to the daunting world of big data.

By the time the term “big data” was coined, data had already accumulated massively with no means of handling it properly. In 1880, the US Census Bureau estimated that it would take eight years to process the data from that year’s census, and predicted that processing the data from the following decade’s census would take more than ten years. Fortunately, in 1881, Herman Hollerith created the Hollerith Tabulating Machine, inspired by a train conductor’s punch card. Although an operator still had to feed data through the machine’s “counters” manually, it was exponentially faster than hand counting. In 1943, the British built a machine to crack Nazi codes: the Colossus. It scanned 5,000 characters per second, reducing weeks of work to mere hours. These inventions opened the world’s eyes to the manifold benefits of automating the handling of data.

This is exactly what artificial intelligence is meant to do: perform tasks more efficiently by mimicking our ability to learn and solve problems. As technology advances at breakneck speed, benchmarks that once defined AI are becoming outdated by the day. Recognizing text through optical character recognition (OCR), once considered a feat, is now taken for granted as a basic computer function. At the same time, data is growing ever bigger, fuelled by its democratization and the IoT environment. While no technology ever has been or will be a magic bullet for industry at large, the leaps and bounds these two worlds are making have led to a synergistic relationship between them: AI is useless without data, and data is insurmountable without AI.

Imagine that you run a company that wants to identify which products work best for which customers, in order to improve your marketing decisions. You would probably first look at your financial reporting systems to see which products sell the most, but this tells you nothing about who is buying them. So you turn to your CRM (Customer Relationship Management) system, which is disconnected from the financials; here you learn what types of customers are buying. You still need to know which marketing campaigns are driving those specific individuals to buy your products. This, too, is possible manually, but in yet another department. After a lot of copying, pasting, and formula-driven Excel spreadsheets, you arrive at an answer. By now you’re exhausted, having gone from department to department to collect information that should have been connected all along, and a precious company resource, time, is running out.
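The manual cross-department lookup described above amounts to joining three datasets on a shared customer identifier. A minimal sketch in Python, assuming purely hypothetical record layouts for the financial, CRM, and marketing data (all field names and values here are illustrative, not from any real system):

```python
from collections import Counter

# Hypothetical data from three disconnected departments.
sales = [  # financial reporting: what sold, and to whom
    {"order_id": 1, "product": "Widget", "customer_id": 101},
    {"order_id": 2, "product": "Gadget", "customer_id": 102},
    {"order_id": 3, "product": "Widget", "customer_id": 103},
]
crm = {  # CRM: what type of customer each buyer is
    101: {"segment": "enterprise"},
    102: {"segment": "consumer"},
    103: {"segment": "enterprise"},
}
campaigns = {  # marketing: which campaign reached each customer
    101: "email_blast",
    102: "social_ads",
    103: "email_blast",
}

# Join all three sources on customer_id, then count which
# (product, segment, campaign) combination drives the most sales.
combos = Counter(
    (row["product"],
     crm[row["customer_id"]]["segment"],
     campaigns[row["customer_id"]])
    for row in sales
)
best = combos.most_common(1)[0]
print(best)  # (('Widget', 'enterprise', 'email_blast'), 2)
```

Once the departments’ data is accessible in one place, the spreadsheet shuffling collapses into a single join-and-aggregate step like this; at real scale, the same idea is what a data platform or ML pipeline automates.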

By opening up access to every department’s data and applying artificial intelligence in the form of machine learning, this work becomes exponentially easier and less time-consuming. Companies today have recognised this and are rapidly deploying AI-related technologies.

However, as with any new technology, implementation can be difficult and ultimately more costly than anticipated. A lack of capabilities to support AI is a common problem, because most companies acquire their AI capabilities from external sources rather than develop them in-house; this approach can spell disaster with a technology that is evolving so rapidly. RockMass Technologies, a lab-based startup, overcame this by establishing trust with mining companies, which welcomed the firm into their testing processes. ThoroughTec, a company based in South Africa, tackled another common challenge, the lack of high-quality data, by running its algorithms on test data in parallel with human observations while developing its product; it used the results to improve the algorithms and slowly scaled back human involvement. Quicker insights gleaned from data and more efficient business operations are already massive improvements, and the combination of AI and big data is only beginning to reveal its possibilities. Companies should learn from trailblazers like ThoroughTec, RockMass, and many more to evolve past their own AI-related problems (possibly by using AI itself).
