Apache iceberg snowflake
11/26/2023

Unicorn cloud data analytics startup Dremio aims to supply analytics processes that run directly on data lakes, and says its latest open source-based software release, which adds speed and more powerful capabilities, is a step towards obsoleting data warehouses and eliminating the "data warehouse tax".

Its so-called Dart initiative makes Dremio's in-memory software, powered by Apache Arrow, run faster and do more, saving customers time and money. The pitch is that running SQL-based analytics routines directly on data stored in Amazon S3 and Azure Data Lake means there is no need to pass through an Extract, Transform and Load (ETL) process to load a data warehouse before running analytics processes.

Data warehouses, such as Snowflake and Yellowbrick Data, have a great deal of functionality and built-in speed. Dremio has to provide both so that customers see no need for ETL prepping of data warehouses as a necessary part of running their preferred analytics processes and getting fast query responses.

Thirumalesh Reddy, VP of Engineering and Security at Dremio, added his two cents: "There are two major dimensions you can optimise to maximise query performance: processing data faster, and processing less data." Dremio's latest software release does both:

- Better query planning: Dremio gathers deep statistics about the underlying data to help its query optimiser choose an optimal execution path for any given query.
- An improved, higher-performance compiler that enables larger and more complex SQL statements with reduced resource requirements.
- Query plan caching: useful when many users simultaneously fire similar queries at the SQL engine as they navigate through dashboards.
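To make the query-plan-caching idea concrete, here is a minimal sketch in Python. It is an illustrative assumption of how such a cache behaves, not Dremio's actual implementation: the point is that when many dashboard users fire near-identical queries, the expensive planning step runs once per distinct statement and the cached plan is reused.

```python
# Illustrative sketch of query plan caching (not Dremio's real code):
# normalise the SQL text so trivially different statements share one
# cached plan, and memoise the expensive planning step.
from functools import lru_cache


def normalise(sql: str) -> str:
    """Collapse whitespace and case so near-identical queries map to one key."""
    return " ".join(sql.lower().split())


@lru_cache(maxsize=1024)
def plan(normalised_sql: str) -> str:
    # Stand-in for an expensive optimiser/compiler pass.
    return f"PLAN[{normalised_sql}]"


def execute(sql: str) -> str:
    """Look up (or build) the plan for a query, then 'run' it."""
    return plan(normalise(sql))


# Two dashboard users issue the same logical query with different
# formatting; only one planning pass actually runs.
execute("SELECT region, SUM(sales) FROM orders GROUP BY region")
execute("select region,  SUM(sales) from orders group by region")
print(plan.cache_info())
```

The `cache_info()` output shows one miss (the first planning pass) and one hit (the reused plan), which is exactly the saving plan caching targets under concurrent, similar dashboard traffic.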