
Open Architecture Is the Innovation Differentiator

Strategically, it’s vital that businesses avoid long-term vendor lock-in. Choosing a lakehouse architected with open standards and open formats eliminates that risk. Anyone who has lived through it is wary of getting back into scenarios where, once a vendor has all your data, it turns the screws on you with maintenance bills and other contracted costs. What motivates the current and next generations of architects and engineers is knowing that open architecture lets them draw on a wide range of new services and apps. That’s fundamentally better because it powers faster innovation in an increasingly cloud-native world.

Architecting with open standards and formats promises a world where, to use a travel analogy, you never have to worry about adapters and converters to plug into power and services. At its best, open is about stripping away cost and complexity and getting everyone on the same page so they can innovate unimpeded. More businesses than ever run day-to-day workloads in which many concurrent users employ different engines and services, for a wide range of purposes, against the same data. Proprietary architectures cannot accommodate that efficiently. Frankly, arguments against open architecture are becoming passé; the same arguments were once levied against the cloud itself and pretty much every technical innovation of the last 50 years.

Everyone likes to throw around the term “open” these days, so it’s important to look closely at version differences, community momentum, the actual level of access, and thought leaders’ perspectives—while giving everything a good test run to see how striking the differences really are.

Everyone Wants an Elegant Open Table Format—But the Metastore Is Key

Recently, at Subsurface 2022, a significant number of major players vying for attention in the lakehouse space gave talks about their support for Apache Iceberg, a popular, community-built table format for data lakes. Iceberg is an open source project that is key to unlocking value with lakehouses because it turns the files in any data lake into workable tables, without the risks of vendor lock-in.
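To make that concrete, here is a minimal PySpark sketch of registering files on a data lake as an Iceberg table. It is only a sketch under stated assumptions: the catalog name, warehouse path, table schema, and package version are all illustrative, not details from the talks.

```python
from pyspark.sql import SparkSession

# Configure Spark with an Iceberg catalog backed by files on a data lake.
# The catalog name ("lake"), warehouse path, and package version below
# are illustrative assumptions.
spark = (
    SparkSession.builder
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.4.3")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Plain files become a transactional table: any Iceberg-aware engine
# (Spark, Flink, Presto, Dremio Sonar, ...) can then read and write the
# same table safely and concurrently.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.sales.orders (
        order_id BIGINT,
        amount   DOUBLE,
        order_ts TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(order_ts))
""")

spark.sql("INSERT INTO lake.sales.orders VALUES (1, 19.99, current_timestamp())")
spark.sql("SELECT count(*) FROM lake.sales.orders").show()
```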

But to bring genuine ease of use to lakehouses, an intelligent metastore for Iceberg is essential, with functions far beyond what a traditional metastore like Hive offers. Those functions, found in a free implementation like Arctic, a hosted version of the open source Nessie project, include automatic data optimization for Iceberg tables (e.g., compacting small files into larger ones, garbage collection, and repartitioning); reproducibility, so AI models can be retrained with just a couple of commands; referential integrity in joins; and logging of all changes to all tables (data and metadata) for better data governance.
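Those automatic optimizations map onto maintenance actions that Iceberg itself exposes. For reference, here is a hedged sketch of triggering the same work by hand through Spark’s Iceberg procedures, reusing the illustrative catalog and table from above; the point of a metastore like Arctic is that it runs this kind of work in the background so you don’t have to.

```python
# Manual Iceberg table maintenance via Spark procedures. The catalog
# ("lake") and table ("sales.orders") reuse the illustrative setup above.

# Compact many small data files into fewer, larger ones.
spark.sql("CALL lake.system.rewrite_data_files(table => 'sales.orders')")

# Garbage collection: expire old snapshots and the files only they reference.
spark.sql("""
    CALL lake.system.expire_snapshots(
        table => 'sales.orders',
        older_than => TIMESTAMP '2022-01-01 00:00:00'
    )
""")

# Reproducibility: query the table exactly as it was at a past moment,
# e.g., to retrain a model on the data an earlier run actually saw.
spark.sql("""
    SELECT * FROM lake.sales.orders TIMESTAMP AS OF '2022-06-01 00:00:00'
""").show()
```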

Additionally, and perhaps most importantly for users, the metastore can offer a GitHub-like experience for data. By bringing branches directly to the independent data tier (i.e., any data lake), users can sandbox experiments, test datasets, and merge successful tests into a main branch, all without creating unmanaged copies of data. That matches the way people think about data and want to work with it in the real world: in multiple sessions, with multiple users, on clean versions, just as they work with application code. Arctic offers this innovation while working across query engines, including Sonar, Flink, Presto, and Spark. That is, and should be, the expectation for any lakehouse: to work with data as code.
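Here is a sketch of that data-as-code workflow, assuming a Nessie-backed Iceberg catalog (named nessie here) with the Nessie Spark SQL extensions enabled; the branch and table names are hypothetical, and exact syntax varies by version.

```python
# Branch the whole catalog: no data is copied, only metadata pointers.
spark.sql("CREATE BRANCH IF NOT EXISTS experiment IN nessie FROM main")

# Point this session at the branch and sandbox the experiment there.
spark.sql("USE REFERENCE experiment IN nessie")
spark.sql("DELETE FROM nessie.sales.orders WHERE amount < 0")  # trial cleanup

# Readers on main are untouched until the tested change is merged back,
# just like merging a feature branch of application code.
spark.sql("MERGE BRANCH experiment INTO main IN nessie")
```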

Embracing Paradigm Shifts Is Non-Negotiable 

Open lakehouse architecture signals the direction of a much larger data paradigm shift. Major innovation is always dismissed as a fairy tale, at odds with running a business efficiently. Vendors unprepared for the future will protest: “But you’ve got a business to consider, and I can get you up and running in a day. What do you care more about anyway, your business or saying you have an open architecture? Are you Apple? Do you have 6,000 PhD engineers working for you?” Of course, arguments like this present a false dichotomy.

Consider the major paradigm shifts of the past several decades. During the mainframe-to-client-server shift, old mainframers dismissed the upstart relational databases as toys: unreliable, full of bugs, with terrible performance compared to the mainframe. The advent of web apps on the internet suffered similar criticism: dot-coms were built with such immature technology, posing so many security risks, and the web ecosystem would never support real work the way meaty client-server applications did. Then along came mobile, whose critics initially pointed to how far it lagged the rich browser capabilities of the desktop. And, of course, the shift from on-premises, monolithic client-server designs to API-connected microservices across cloud, hybrid, and distributed ecosystems is in full swing today, yet it was met with all the same criticisms.

The truth is that no new paradigms are adopted wholesale and overnight. Use-case experimentation is always the starting point. Enterprises don’t turn off their current systems. They start building or adding adaptations where they make the most sense. Nobody should feel this is an either/or proposition, but everyone should feel the urgency to, at a minimum, understand the approaching paradigm shift to open data infrastructure models, like open lakehouses.

While all paradigm shifts are hard initially, the yield isn’t just a replacement technology; it’s a different experience offering different capabilities. Company leaders—CEOs, CTOs, CIOs, and boards—are tasked with keeping their fingers on the pulse of the future to identify where the trends are moving. Leaders focused only on where the puck is today, not where it will be in 1, 2, 5, and 10 years, will lose their market position or never gain one in the first place.

