Source – techtarget.com
Advances in computing power, the sheer volume of data now available online, and improved artificial intelligence algorithms have finally made AI practical. But how should you implement artificial intelligence data storage?
There is no one-size-fits-all answer for artificial intelligence data storage. Every AI application is different and so is the data that is associated with the application. As such, there are a number of different questions that you must consider when planning AI data storage.
What is the nature of the source data?
AI applications are dependent on source data; you must know where the source data resides and how the application uses it.
Suppose that a particular AI application is designed to make decisions based on the input received from a collection of industrial internet of things sensors. You must know whether the application treats the sensor data as transient. Can the application analyze the sensor data in near real time as it arrives from the sensors, or does the application need to store the data and then analyze it?
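To make the distinction concrete, here is a minimal Python sketch of the two paths. The threshold, the sample readings and the in-memory deque standing in for durable storage are all hypothetical placeholders, not part of any particular product; the point is that the transient path needs little storage, while the store-then-analyze path does.

```python
# Sketch only: contrasts treating sensor data as transient vs. storing it for later analysis.
from collections import deque
from statistics import mean

THRESHOLD = 75.0  # hypothetical alert threshold for a temperature sensor

def handle_transient(reading: float) -> None:
    """Transient path: act on each reading as it arrives, then discard it."""
    if reading > THRESHOLD:
        print(f"ALERT: reading {reading} exceeds threshold")

stored_readings = deque(maxlen=10_000)  # stand-in for persistent storage

def handle_persistent(reading: float) -> None:
    """Store-then-analyze path: the reading must land on durable storage first."""
    stored_readings.append(reading)

def analyze_batch() -> float:
    """Later analysis pass over everything that was stored."""
    return mean(stored_readings) if stored_readings else 0.0

if __name__ == "__main__":
    for value in (70.2, 71.8, 76.4, 69.9):
        handle_transient(value)   # near-real-time analysis: no storage required
        handle_persistent(value)  # stored analysis: storage capacity matters
    print(f"Average of stored readings: {analyze_batch():.1f}")
```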
How much data will the AI application generate?
An equally important consideration for artificial intelligence data storage is the volume of data that the application will produce. AI applications produce data of their own; they generally analyze the source data and then write the results of the analysis to a back-end database that the application’s decision tree can use. It would not be practical for an AI application to parse multiple terabytes or even petabytes of data every time the software must make a decision. It is far more practical for the application to query a database of information that has already been parsed.
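A rough illustration of that pattern follows, using an in-memory SQLite database as a stand-in for the application's back-end store. The table layout, names and threshold are invented for the example; what matters is that the decision path issues one small query against pre-computed results instead of re-parsing the raw source data.

```python
# Sketch only: write analysis results to a back-end database, then query them for decisions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensor_summary (sensor_id TEXT, hour TEXT, avg_temp REAL)")

def write_analysis_result(sensor_id: str, hour: str, avg_temp: float) -> None:
    """Persist the result of analyzing a batch of raw readings."""
    conn.execute("INSERT INTO sensor_summary VALUES (?, ?, ?)", (sensor_id, hour, avg_temp))
    conn.commit()

def decide(sensor_id: str, hour: str) -> bool:
    """Decision path: one small lookup rather than a pass over terabytes of raw data."""
    row = conn.execute(
        "SELECT avg_temp FROM sensor_summary WHERE sensor_id = ? AND hour = ?",
        (sensor_id, hour),
    ).fetchone()
    return row is not None and row[0] > 75.0

if __name__ == "__main__":
    write_analysis_result("sensor-42", "2024-01-01T13:00", 78.3)
    print(decide("sensor-42", "2024-01-01T13:00"))  # True
```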
How will you use the AI application?
You must consider how many people will use the application at a given moment and how quickly the application will need to deliver information to users.
Consider Cortana, Microsoft’s AI-based personal digital assistant for Windows. Vast numbers of people could use Cortana simultaneously. Cortana accepts verbal input and responds verbally to questions, which means it requires an extremely high-performing storage back end. On the other hand, a lightweight AI-based business application that half a dozen people use might not require more than a single SSD. You must build a back-end storage system that meets the application’s expected I/O requirements.
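One way to turn those expectations into a number is a back-of-the-envelope IOPS estimate. The figures below -- requests per user per second and I/O operations per request -- are hypothetical stand-ins for whatever profiling the real application would reveal, but they show why the two scenarios above land on very different storage back ends.

```python
# Sketch only: rough peak-load IOPS estimate from concurrent users. All inputs are placeholders.
def required_iops(concurrent_users: int, requests_per_user_per_sec: float,
                  io_ops_per_request: int) -> float:
    """Approximate storage IOPS the back end must sustain at peak load."""
    return concurrent_users * requests_per_user_per_sec * io_ops_per_request

# Lightweight internal app: half a dozen users, modest request rates.
print(required_iops(6, 0.5, 20))          # 60 IOPS -- a single SSD is plenty

# Large-scale assistant serving many simultaneous users.
print(required_iops(1_000_000, 0.1, 20))  # 2,000,000 IOPS -- needs a high-performing back end
```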