There have been numerous posts and tweets coming from the NonStop vendor community following RUG events worldwide: ETBC,...
With Striim, HPE NonStop users can capitalize on CDC and supply data to popular data lakes!
If you check the Striim blog on a regular basis, two recent posts will likely have caught your attention. While many members of the NonStop community have been spending time at user events around the world, many equally important events were taking place elsewhere, and Striim was an active participant in a number of them. There is still no real substitute for being somewhere you can tell your story directly to those most interested in solving business problems, and few problems are bigger than integrating transaction processing with real-time analytics. “Turning Change Data to Time-Sensitive Insights” is proving to be key to the ongoing success of Striim among enterprises where mission-critical solutions play an active role in everyday business.
The concept of using Change Data Capture (CDC) to pull information from transactional systems like NonStop is well known in the NonStop community. Many products have used this principle to pull data from NonStop as a steady stream, and as a result there are numerous replication products for business continuity that rely on CDC methodology. However, the interest in Striim has arisen not so much from its application to business continuity as from its ability to pull data from NonStop and update data lakes, including the increasingly popular Snowflake.
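To make the CDC idea concrete, here is a minimal sketch in plain Python of what a log-based capture loop looks like conceptually. This is purely illustrative (the event shape, the in-memory log, and the reader class are all invented for this sketch, not Striim's actual API): a reader tails an append-only change log from a saved offset, so each committed change is shipped downstream exactly once as a stream of events.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeEvent:
    """One committed change read from a (hypothetical) transaction log."""
    op: str                       # "INSERT", "UPDATE", or "DELETE"
    table: str
    key: int
    after: Optional[dict] = None  # row image after the change (None for DELETE)

class LogBasedReader:
    """Tails an append-only change log, resuming from a saved offset
    so every committed change is captured exactly once."""
    def __init__(self, log):
        self.log = log        # stand-in for a real database transaction log
        self.offset = 0       # position of the last change already shipped

    def poll(self):
        """Return all changes appended since the last poll."""
        new_events = self.log[self.offset:]
        self.offset = len(self.log)
        return new_events

# Simulated transaction log for the sketch
log = [
    ChangeEvent("INSERT", "accounts", 1, {"balance": 100}),
    ChangeEvent("UPDATE", "accounts", 1, {"balance": 80}),
]
reader = LogBasedReader(log)
batch = reader.poll()           # picks up both committed changes
log.append(ChangeEvent("DELETE", "accounts", 1))
later = reader.poll()           # only the new DELETE arrives
```

Because the reader works from the log rather than querying the tables themselves, it never competes with the OLTP workload for table locks, which is one reason log-based CDC is favored for production systems.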
However, it was the head of Product Marketing at Striim, Irem Radzik, who perhaps captured the benefits of CDC best in the first post on the topic, Log-Based Change Data Capture: the Best Method for CDC. “Change data capture, and in particular log-based change data capture, has become popular in the last two decades as organizations have discovered that sharing real-time transactional data from OLTP databases enables a wide variety of use-cases,” said Radzik. “The fast adoption of cloud solutions requires building real-time data pipelines from in-house databases, in order to ensure the cloud systems are continually up to date. Turning enterprise databases into a streaming source, without the constraints of batch windows, lays the foundation for today’s modern data architectures.”
Should the reference to Snowflake make you more than a little curious, you may want to check out the data sheet Striim has recently developed covering Striim to Snowflake, which continues to gain popularity with today’s enterprises. Snowflake runs on both AWS and Azure, providing a simple, easily supported option for enterprises that require a cloud-neutral data warehouse / data lake. As for Striim, it enables enterprises “to stream enterprise data from on-premises and cloud-based sources to Snowflake in real time, with built-in scalability, security and reliability.” Sound familiar? As Radzik noted, when it comes to streaming data in this manner, it’s all about CDC! For more information about Striim support for Snowflake, see the data sheet published on the Striim web site – https://go2.striim.com/streaming-data-integration-for-snowflake
Of course, none of this could have happened without the CDC capabilities inherent in Striim, and this is also the theme of the second post on the topic. “The business transactions captured in relational databases are critical to understanding the state of business operations,” wrote Striim Senior VP of Marketing, Katherine Rincon, in her post Change Data Capture Methods. “Change Data Capture provides real-time or near real-time movement of data by moving and processing data continuously as new database events occur. Moving the data continuously, throughout the day, also uses network bandwidth more efficiently.”
More impressive still, wrote Rincon, “Of all of these change data capture methods, the Striim platform uses log-based CDC to continuously extract and move the changed data. Striim also filters, transforms, aggregates, masks, and enriches change data while it is in-motion, allowing it to be delivered to a variety of targets in the format required with sub-second latency. Built-in database delivery validation capabilities compare the sources and targets as transactions are replicated. This tracks key performance metrics for data pipelines and validates source and target database consistency for zero data loss.”
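To give a feel for the “filters, transforms, masks, and enriches” steps Rincon describes, here is a minimal sketch in plain Python. The function names, event shape, and lookup data are all hypothetical, invented for illustration rather than taken from Striim; the sketch simply shows each stage being applied to change events while they are in motion, before delivery to a target.

```python
def filter_events(events, table):
    """Keep only changes to the table of interest."""
    return [e for e in events if e["table"] == table]

def mask(event, fields):
    """Redact sensitive fields before delivery (illustrative masking)."""
    row = {k: ("***" if k in fields else v) for k, v in event["row"].items()}
    return {**event, "row": row}

def enrich(event, lookup):
    """Join in reference data while the event is in flight."""
    extra = lookup.get(event["row"].get("region"), {})
    return {**event, "row": {**event["row"], **extra}}

# A small batch of change events from a hypothetical CDC source
events = [
    {"op": "INSERT", "table": "orders",    "row": {"card": "4111-1111", "region": "EU"}},
    {"op": "UPDATE", "table": "inventory", "row": {"sku": "A1"}},
]
regions = {"EU": {"currency": "EUR"}}   # illustrative reference data

# Filter, mask, then enrich each event while it is in motion
pipeline = [enrich(mask(e, {"card"}), regions)
            for e in filter_events(events, "orders")]
```

The point of doing this work in-flight, as the post explains, is that the data lands at the target already in the required shape, without a separate batch transformation step adding latency.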
Both of these posts are well worth reading, and if you haven’t yet signed up to be notified about new posts to the Striim blog, you should consider doing so – visiting the Striim blog will prompt you to sign up, so don’t miss the opportunity to read further updates from Striim as they are published. And isn’t it time you looked more closely at Striim for use on your NonStop system? Striim supports SQL/MX, SQL/MP and Enscribe as source databases. Should reading more about CDC and Striim not be incentive enough to visit the Striim blog, then in late-breaking news, read how Striim Sweeps 2019 Best Places to Work Awards.