In the previous post, we demonstrated how to model percentile estimates and use them in Tableau without moving large amounts of data. You may ask, "How accurate are the results, and how much load is placed on the cluster?" In this post, we discuss the accuracy and scaling properties of the AtScale percentile estimation algorithm.
In the previous post, we discussed typical use cases for percentiles and the advantages of percentile estimates. In this post, we illustrate how to model percentile estimates with AtScale and use them from Tableau.
A new and powerful method of computing percentile estimates on Big Data is now available to you! By combining the well-known t-Digest algorithm with AtScale's semantic layer and smart aggregation features, AtScale addresses gaps in both the Business Intelligence and Big Data landscapes. Most BI tools have features to compute and display various percentiles (e.g., medians and interquartile ranges), but they move data for processing, which dramatically limits the size of the analysis. The Hadoop-based SQL engines (Hive, Impala, Spark) can compute approximate percentiles on large datasets; however, these expensive calculations are not aggregated and reused to answer similar queries. AtScale combines robust percentile estimation with its semantic layer and aggregate tables to deliver fast, accurate, and reusable results.
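The key property that makes this reuse possible is that sketches like t-Digest are mergeable: each partition of the data can be summarized independently, and the small summaries can be combined later to answer percentile queries without rescanning the raw rows. The toy sketch below illustrates that idea with a fixed-size random sample per partition; it is a simplified stand-in, not the actual t-Digest algorithm or AtScale's implementation, and all function names here are illustrative.

```python
import random

# Toy illustration of a mergeable percentile summary (NOT the real t-Digest):
# each partition keeps a small sorted sample; samples merge cheaply, and the
# merged summary yields an approximate percentile with no access to raw data.

def make_sketch(values, size=100):
    """Summarize one data partition with a fixed-size random sample."""
    sample = random.sample(values, min(size, len(values)))
    return sorted(sample)

def merge_sketches(a, b):
    """Merging two summaries is cheap -- no raw data is needed."""
    return sorted(a + b)

def estimate_percentile(sketch, p):
    """Read the approximate p-th percentile off a (merged) summary."""
    idx = min(len(sketch) - 1, int(p / 100 * len(sketch)))
    return sketch[idx]

random.seed(42)
part1 = [random.gauss(50, 10) for _ in range(10_000)]  # partition 1
part2 = [random.gauss(70, 10) for _ in range(10_000)]  # partition 2
merged = merge_sketches(make_sketch(part1), make_sketch(part2))
median_est = estimate_percentile(merged, 50)
print(median_est)
```

A real t-Digest keeps weighted centroids sized so that accuracy is highest near the tails, but the workflow is the same: build per-partition digests once, store them in aggregate tables, and merge them to answer later percentile queries.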
In this three-part blog series we discuss the benefits of percentile estimates and how to compute them in a Big Data environment. Subscribe today to learn the best practices of percentile estimation on Big Data and more. Let's dive right in!
Did you know that 9 out of 10 companies are only able to analyze less than half of the data they collect? In fact, 45% of these companies analyze less than one quarter of this data. Why is that? The answer is simple. We are generating more data than ever, 44 zettabytes by the year 2020 to be precise. What does this mean for you? A vast data lake filled with valuable potential to make better decisions for your company. Remember what Sherlock Holmes once said: "I can't make bricks without clay!" We need data to answer complex questions.
Presidents' Day is the perfect opportunity to explore, honor, and remember the legacy of Washington, Lincoln, and other presidents. It is also a great day for all those who are looking for a good sale at their favorite retail or online store. What does this mean? Hundreds of millions of sales transactions will generate an enormous amount of financial and inventory data on this one day.
Data, Data, Data! All the facts, numbers, and everything in between that, once collected, can be analyzed to drive decisions. Can't live with it! Can't live without it! This year over $14 billion will be spent globally on flowers, chocolate, and jewelry on Valentine's Day alone. According to CIO Research, big data analytics investments are expected to be close to $187 billion. We are definitely investing more in data than in expensive chocolates! The big question is, are your data investments nurturing your relationship with data the same way a bouquet of flowers and a box of chocolates can nurture your relationships?
The AtScale Enterprise Data Visionary Award program recognizes the leading enterprises that have reached exceptional productivity levels by modernizing their Big Data Analytics environment. Customers were evaluated based on ingenuity, innovation, business value and impact of their AtScale implementation.
We want to congratulate the 2018 AtScale Visionary Award Winners! Read on to learn more about how these industry leaders used AtScale to become trend-setters in the Big Data world.
With the recent GDPR regulation, data security is becoming a bigger concern for enterprises. In this post, you will learn about True Delegation and how it helps enterprises achieve their data security and governance requirements. You will also learn about the challenges, the Hadoop authorization landscape, and the unique solution that AtScale provides.
A version of this article originally appeared on the Cloudera VISION blog.
One of my favorite parts of my role is that I get to spend time with customers and prospects, learning what’s important to them as they move to a modern data architecture. Lately, a consistent set of six themes has emerged during these discussions. The themes span industries, use cases and geographies, and I’ve come to think of them as the key principles underlying an enterprise data architecture.
Whether you’re responsible for data, systems, analysis, strategy or results, you can use these principles to help you navigate the fast-paced modern world of data and decisions. Think of them as the foundation for data architecture that will allow your business to run at an optimized level today, and into the future.
Keeping resolutions is hard. Research shows that most of us fail to follow through on our New Year's resolutions by the second week of February! We are hopeful that 2018 will be different, though! In this contributed piece, Donald Farmer takes us through his best practices for making and keeping resolutions. Donald is a highly respected figure in the Data Analytics world and has built outstanding product franchises at Qliktech and Microsoft. He is currently Principal at TreeHive Strategy, an I.T. advisory firm.