
Hazelcast Simplifies Application Modernization with Event Driven Architectures

Source: adtmag.com

The latest release of Hazelcast’s Jet event stream processing engine, used in AI and ML deployments of mission-critical applications, adds new application development features designed to simplify the integration of an event-driven architecture into brownfield deployments, giving existing systems new real-time and in-memory processing capabilities.

“Stream processing” means processing data in motion: users act on data at the moment it is created. It typically involves multiple tasks performed on an incoming series of data, and those tasks can run serially, in parallel, or both. A stream processing pipeline starts with the generation of the data, continues with the processing of the data, and ends with the delivery of the data to its destination.
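To make those three stages concrete, here is a minimal sketch of a Hazelcast Jet pipeline in Java, using Jet's Pipeline API with a built-in test source standing in for real data generation; the class name and the logging sink are illustrative choices, not part of Hazelcast's announcement.

    import com.hazelcast.jet.Jet;
    import com.hazelcast.jet.JetInstance;
    import com.hazelcast.jet.pipeline.Pipeline;
    import com.hazelcast.jet.pipeline.Sinks;
    import com.hazelcast.jet.pipeline.test.TestSources;

    public class StreamPipelineSketch {
        public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();
            // Stage 1: generation -- a test source emitting 10 events per second
            pipeline.readFrom(TestSources.itemStream(10))
                    .withIngestionTimestamps()
                    // Stage 2: processing -- transform each event as it arrives
                    .map(event -> "processed: " + event)
                    // Stage 3: delivery -- write the results to a sink (here, the log)
                    .writeTo(Sinks.logger());

            JetInstance jet = Jet.newJetInstance();
            jet.newJob(pipeline); // a streaming job runs until it is cancelled
        }
    }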

With the release of Hazelcast 4.2, the in-memory computing platform maker is making it possible to bring Jet's capabilities to existing applications through real-time caching and stateful microservices.

The list of updates and enhancements in this release includes new support for streaming integration with MySQL and PostgreSQL databases using a unified high-level API. Traditional RDBMS-based applications require hundreds or thousands of lines of code to add functionality, as well as significant testing. With the new API in Jet 4.2, the integration becomes a largely declarative task, reducing the amount of custom, error-prone code.

Additionally, Hazelcast Jet makes the database available as a stream: it handles connectivity and object mapping, and it unifies event handling across database vendors. The series of database updates forms an event stream onto which developers can more easily add microservices without impacting existing applications, as sketched below. This simplification lets developers focus on adding new business logic to high-performance applications rather than managing complex and error-prone integrations.
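As a rough illustration of the "database as a stream" idea, the sketch below reads MySQL changes through Jet's Debezium-based CDC source; the connection settings are placeholders for a local test database, and the builder and ChangeRecord method names reflect the Jet 4.2 CDC module and should be checked against the version in use.

    import com.hazelcast.jet.Jet;
    import com.hazelcast.jet.cdc.ChangeRecord;
    import com.hazelcast.jet.cdc.mysql.MySqlCdcSources;
    import com.hazelcast.jet.pipeline.Pipeline;
    import com.hazelcast.jet.pipeline.Sinks;
    import com.hazelcast.jet.pipeline.StreamSource;

    public class CdcStreamSketch {
        public static void main(String[] args) {
            // Turn a MySQL table's change log into a Jet event stream.
            StreamSource<ChangeRecord> source = MySqlCdcSources.mysql("customers-cdc")
                    .setDatabaseAddress("127.0.0.1")   // placeholder connection details
                    .setDatabasePort(3306)
                    .setDatabaseUser("debezium")
                    .setDatabasePassword("dbz")
                    .setClusterName("dbserver1")
                    .setTableWhitelist("inventory.customers")
                    .build();

            Pipeline pipeline = Pipeline.create();
            pipeline.readFrom(source)
                    .withNativeTimestamps(0)
                    // Each ChangeRecord describes one insert, update, or delete; new
                    // business logic can react to it here without touching the
                    // application that owns the database.
                    .map(record -> record.operation() + ": " + record.value().toJson())
                    .writeTo(Sinks.logger());

            Jet.newJetInstance().newJob(pipeline);
        }
    }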

The transactional data from MySQL or PostgreSQL can be augmented and enriched with other datasets from Hadoop, Amazon S3, Google Cloud Storage, Azure Data Lake, and others, the company says, and served through thousands of concurrent low-latency queries and fine-grained, key-based access.
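One way such enrichment can be expressed is with Jet's mapUsingIMap stage, which does a key-based, in-memory lookup for every event; in the sketch below, the "products" map and the key derivation are hypothetical stand-ins for reference data that would have been loaded from one of those external stores.

    import com.hazelcast.jet.Jet;
    import com.hazelcast.jet.pipeline.Pipeline;
    import com.hazelcast.jet.pipeline.Sinks;
    import com.hazelcast.jet.pipeline.test.TestSources;

    public class EnrichmentSketch {
        public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();
            pipeline.readFrom(TestSources.itemStream(5))    // stand-in for the CDC stream
                    .withIngestionTimestamps()
                    // Enrich each event with a key-based lookup into the "products" IMap,
                    // which is assumed to have been pre-loaded by a separate batch job.
                    .mapUsingIMap("products",
                            event -> event.sequence() % 100,              // derive the lookup key
                            (event, product) -> event + " -> " + product)
                    .writeTo(Sinks.logger());

            Jet.newJetInstance().newJob(pipeline);
        }
    }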

“With these enhancements, enterprises can take advantage of in-memory speeds to accelerate analytical queries to scale your architecture by offloading certain workloads from your transactional database into an in-memory store,” the company said in a statement.
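What serving those offloaded workloads from memory might look like is sketched below: a query against a hypothetical "order-totals" IMap, assumed to be kept current by a Jet pipeline, using Hazelcast's Predicates API so that the read never hits the transactional database.

    import java.util.Collection;

    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.map.IMap;
    import com.hazelcast.query.Predicates;

    public class InMemoryQuerySketch {
        public static void main(String[] args) {
            HazelcastInstance hz = Hazelcast.newHazelcastInstance();

            // Hypothetical map kept up to date by a CDC pipeline rather than
            // written directly by the transactional application.
            IMap<Long, Double> orderTotals = hz.getMap("order-totals");
            orderTotals.put(1L, 250.0);    // placeholder data for the sketch
            orderTotals.put(2L, 1800.0);

            // "this" refers to the map value itself in Hazelcast's query syntax.
            Collection<Double> largeOrders =
                    orderTotals.values(Predicates.greaterThan("this", 1000.0));
            largeOrders.forEach(System.out::println);

            hz.shutdown();
        }
    }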

Hazelcast added support for change data capture (CDC) to Jet earlier this year via the open-source Debezium project. In version 4.2, the CDC integration has been optimized to reduce the manual coding required to use this capability, the company says.

Also, over the last year, the library of connectors for Hazelcast Jet has been expanded to include Apache Beam, Confluent, MongoDB, JDBC, Apache Cassandra, and others. Version 4.2 includes connectors for Elasticsearch and Apache Pulsar. By connecting Jet to Elasticsearch, the company says, enterprises can rapidly enrich large data sets, including those from relational databases, and transform them into formats suitable for indexing and search-based analysis by Elasticsearch.
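A rough sketch of pushing a Jet stream into Elasticsearch with the new connector is shown below; ElasticSinks and ElasticClients come from Jet's Elasticsearch module, the "events" index name and the test source are illustrative, and the exact factory methods should be checked against the connector version in use.

    import java.util.Map;

    import org.elasticsearch.action.index.IndexRequest;

    import com.hazelcast.jet.Jet;
    import com.hazelcast.jet.elastic.ElasticClients;
    import com.hazelcast.jet.elastic.ElasticSinks;
    import com.hazelcast.jet.pipeline.Pipeline;
    import com.hazelcast.jet.pipeline.test.TestSources;

    public class ElasticSinkSketch {
        public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();
            pipeline.readFrom(TestSources.itemStream(5))    // stand-in for an enriched stream
                    .withIngestionTimestamps()
                    .writeTo(ElasticSinks.elastic(
                            () -> ElasticClients.client("localhost", 9200),
                            // Index each event as a document so it becomes searchable.
                            event -> new IndexRequest("events").source(Map.of(
                                    "sequence", event.sequence(),
                                    "timestamp", event.timestamp()))));

            Jet.newJetInstance().newJob(pipeline);
        }
    }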
