
New Confluent Cloud capabilities simplify data management

Data streaming company Confluent today announced new features for Confluent Cloud aimed at ensuring data is trustworthy, easily processed, and securely shared.

Among these features is Data Quality Rules, an extension of the Stream Governance suite. With it, users can remediate data quality issues so that their data can be relied on for business-critical decisions.

Furthermore, the new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink are intended to help companies gain insights from their data on a single platform, reducing operational burdens and improving performance.

“Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” said Shaun Clowes, chief product officer at Confluent. “As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats.”

Data Quality Rules allows schemas stored in Schema Registry to be augmented with several types of rules, so teams can improve data integrity, resolve quality issues quickly, and simplify schema evolution.
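As a rough sketch of how this could look in practice, the snippet below registers an Avro schema along with a CEL-based domain rule through the Schema Registry REST API. The endpoint, subject name, credentials, and field names are all placeholders, and the exact shape of the ruleSet payload should be checked against Confluent’s documentation.

```python
# Hypothetical sketch: registering a schema with a data quality rule.
# The registry URL, subject, and credentials below are placeholders.
import json
import requests

SCHEMA_REGISTRY = "https://psrc-xxxxx.us-east-1.aws.confluent.cloud"  # placeholder
SUBJECT = "orders-value"

schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

payload = {
    "schema": json.dumps(schema),
    "schemaType": "AVRO",
    # A domain rule in Common Expression Language (CEL): reject any
    # message whose amount is not positive. Rule fields are illustrative.
    "ruleSet": {
        "domainRules": [
            {
                "name": "checkAmountPositive",
                "kind": "CONDITION",
                "type": "CEL",
                "mode": "WRITE",
                "expr": "message.amount > 0",
            }
        ]
    },
}

resp = requests.post(
    f"{SCHEMA_REGISTRY}/subjects/{SUBJECT}/versions",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    auth=("SR_API_KEY", "SR_API_SECRET"),  # placeholder credentials
    json=payload,
)
resp.raise_for_status()
print("Registered schema id:", resp.json()["id"])
```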

Custom Connectors, meanwhile, allow any Kafka connector to run on Confluent Cloud without the need for infrastructure management.

With this, teams can connect to any data system through their own Kafka Connect plugins without code changes, gain high availability and performance through monitoring of their connectors’ and workers’ health, and reduce the operational burden of managing low-level connector infrastructure.
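The “no code changes” point is easiest to see with a standard Kafka Connect configuration. The sketch below posts a connector config to a self-managed Connect cluster’s REST API; with Custom Connectors, the same plugin jar is uploaded to Confluent Cloud and the same key/value pairs are supplied there instead. The connector class and its settings are hypothetical.

```python
# Illustrative only: a standard Kafka Connect connector configuration.
# The connector class and its settings are hypothetical examples.
import requests

connector = {
    "name": "orders-http-sink",
    "config": {
        "connector.class": "com.example.connect.HttpSinkConnector",  # your own plugin
        "tasks.max": "1",
        "topics": "orders",
        "http.url": "https://internal.example.com/ingest",  # hypothetical setting
    },
}

# On self-managed Kafka Connect, this config is posted to the Connect
# REST API. With Custom Connectors, the identical config is supplied
# through Confluent Cloud after uploading the plugin.
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```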

Lastly, Confluent’s Stream Sharing allows teams to easily share data with enterprise-grade security. With it, users can exchange real-time data with any Kafka client; share and protect their own data with authenticated sharing, access management, and layered encryption controls; and improve the quality and compatibility of shared data with consistent schemas across users, teams, and organizations.
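Because shared streams work with any Kafka client, a recipient can consume them with ordinary consumer code. Below is a minimal sketch using the confluent-kafka Python client; the bootstrap server, credentials, group id, and topic name are placeholders for values a Stream Sharing invite would provide.

```python
# A minimal consumer sketch for a shared topic. All connection details
# below are placeholders supplied to the data recipient.
from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "SHARED_API_KEY",     # credentials issued to the recipient
    "sasl.password": "SHARED_API_SECRET",
    "group.id": "partner-analytics",       # hypothetical consumer group
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["shared-orders"])  # hypothetical shared topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: {msg.value()}")
finally:
    consumer.close()
```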

To learn more, read Confluent’s blog post.
