Striim delivers end-to-end data streaming integration platform
NonStop users can leverage it today with CDC techniques they already know!
Two newsworthy commentaries of interest to the NonStop community were posted recently to the Striim blog, and both featured Striim’s CTO Steve Wilkes. In the first post referenced here, SiliconANGLE’s theCUBE Interviews Striim CTO on Company’s End-to-End Streaming Integration Platform, published March 16, 2018, we read of Striim delivering a true end-to-end data streaming integration platform. In the second post referenced in this article, the first in a series of tutorials, Tutorial: Real-Time Database Integration with Apache Kafka via Change Data Capture, published just a few days later on March 22, 2018, it is again Striim’s Wilkes who takes us through the process of building data flows for streaming integration and analytics applications using the Striim platform.
For the NonStop community, the story here centers on Striim’s use of Change Data Capture (CDC) techniques, similar to what many members of the NonStop community may recall first hearing about from GoldenGate Software. Back when GoldenGate first broke onto the market with its data replication solution, CDC was relatively new, but today it is accepted as the least intrusive and, yes, proven manner by which data can be captured as it is created – in real time! The implication is that NonStop data, whether held in SQL or Enscribe, can be treated as a source and ingested into Striim’s data streaming integration platform – again, in real time!
The first post highlights Striim CTO, Steve Wilkes, when he was a featured guest on SiliconANGLE’s theCUBE. Steve sat down with Dave Vellante, Co-Founder & Co-CEO of SiliconANGLE Media, Inc., to discuss Striim’s end-to-end platform:
“One of the highlights from the discussion stemmed from an observation Dave made about companies claiming to be an end-to-end solution, but inevitably relying on a third party for part of their offering. Dave asked Steve to convince him that Striim is truly end-to-end, which launched into a great overview of the platform and how the company can meet all the requirements of collecting, processing, analyzing, and visualizing data in real time to properly label itself as an end-to-end solution.”
If you follow the link in the above post to the SiliconANGLE web site, you will come across more of what Striim’s Wilkes has to say in the update, Is this the perfect recipe for data streaming, integration and analysis?, including:
“When you are thinking about doing streaming integration, it’s more than just moving data around,” said Steve Wilkes, founder and chief technical officer of Striim Inc. “[You] can’t just give people streaming data; [you] need to give them the ability to process that data, analyze it, visualize it, play with it, and really truly understand the data.”
“The first part of being able to do streaming data integration or analytics is that you need to be able to collect the data,” Wilkes said. The Striim platform has wizards to help build data flows and create streams, with SQL-based processing for filtering, transformation, aggregation, and enrichment of data, as well as a cache component to load reference data into memory. “So you take the data stream, you build another data flow that is doing some aggregation of windows, maybe some complex event processing, and then use that dashboard builder to build a dashboard to visualize all of that,” Wilkes concluded. When you visit these sites, make sure you follow the link to the complete video of Wilkes’ appearance on theCUBE!
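The processing steps Wilkes describes – filtering a stream, enriching it from an in-memory cache of reference data, and aggregating over a window – can be sketched conceptually in Python. This is purely an illustration of the ideas, not Striim’s actual SQL dialect or API; all names and event shapes here are hypothetical:

```python
from collections import defaultdict

# Hypothetical reference data loaded into memory, standing in for
# Striim's cache component: account id -> region.
ACCOUNT_CACHE = {"A1": "EMEA", "A2": "AMER"}

def process_window(events, min_amount=10):
    """Filter, enrich, and aggregate one window of events.

    Each event is a dict like {"account": "A1", "amount": 25.0}.
    Returns the total amount per region for events at or above
    min_amount, mimicking a SQL-style
    SELECT region, SUM(amount) ... GROUP BY region over a window.
    """
    totals = defaultdict(float)
    for ev in events:
        if ev["amount"] < min_amount:                          # filtering
            continue
        region = ACCOUNT_CACHE.get(ev["account"], "UNKNOWN")   # enrichment
        totals[region] += ev["amount"]                         # aggregation
    return dict(totals)

window = [
    {"account": "A1", "amount": 25.0},
    {"account": "A2", "amount": 5.0},   # below min_amount, filtered out
    {"account": "A1", "amount": 15.0},
]
print(process_window(window))  # {'EMEA': 40.0}
```

In a real deployment this logic would be expressed declaratively in the platform’s SQL-based continuous queries rather than hand-coded, but the filter/enrich/aggregate shape is the same.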
In the second post, the first tutorial of a series, Real-Time Database Integration with Apache Kafka via Change Data Capture, Wilkes covers familiar ground:
“The first step in any streaming integration is sourcing data. You are probably aware that Striim can continuously collect data from many sources. In this tutorial we are going to be using change data capture (CDC) to stream database DML activity (inserts, updates and deletes) from a MySQL database. The same process would be used to read from Oracle, SQL Server, and MariaDB, but MySQL was chosen so you can easily try this at home.”
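Conceptually, what the tutorial describes is a stream of DML change events being applied, in order, to a downstream copy of the data. A minimal Python sketch of that idea follows – this is an illustration only, with hypothetical event shapes, not Striim’s actual CDC format:

```python
def apply_cdc_event(table, event):
    """Apply one change-data-capture event to an in-memory 'table'
    (a dict keyed by primary key), mirroring the insert, update, or
    delete that occurred on the source database."""
    op, key = event["op"], event["key"]
    if op == "insert":
        table[key] = dict(event["data"])
    elif op == "update":
        table[key].update(event["data"])
    elif op == "delete":
        del table[key]
    return table

replica = {}
change_stream = [
    {"op": "insert", "key": 1, "data": {"name": "alice", "balance": 100}},
    {"op": "update", "key": 1, "data": {"balance": 75}},
    {"op": "insert", "key": 2, "data": {"name": "bob", "balance": 50}},
    {"op": "delete", "key": 2},
]
for ev in change_stream:
    apply_cdc_event(replica, ev)
print(replica)  # {1: {'name': 'alice', 'balance': 75}}
```

The point of CDC is that these events are captured from the source database’s own change log as the DML happens, so the downstream copy stays current without intrusive polling of the source.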
It is exactly the same process, too, as would be used to read from NS SQL and NS Enscribe. NonStop users should have no problem leveraging the database files and tables that already reside on NonStop, and indeed, a number of NonStop users have already elected to leverage Striim in this manner, including HPE’s own IT organization. But one observation by Wilkes, tucked away in the body of the tutorial, may prove of interest to some NonStop users wanting to know more about how to take baby steps toward introducing Striim into their NonStop environment –
“Note: If you are using Striim in the cloud, and have configured an on-premise agent to collect data to send to the cloud, you will need to click on ‘configure an on-premise source.’”
Yes, Striim makes the ideal end-to-end data streaming integration platform for deployment from within a cloud – an opportunity for NonStop users to demonstrate hybrid IT deployment in a practical manner while integrating data from NonStop with analytics that enterprises will surely value highly. A case of win-win on multiple fronts and a big opportunity to brighten the spotlight shining directly on NonStop! If you would like to know even more about Striim and its data streaming integration platform, you can contact the Striim team at any time. We look forward to hearing from you!