Every once in a while, the ultimate question comes up: "What is the best analysis tool for BI on Hadoop?!". AtScale is not in the business of favoring one tool over another. We are in the business of making all of them work. There are indeed many reasons why business users and I.T. departments choose particular analysis tools. Here are a few things to consider.
Conducting exploratory data analysis or even basic business intelligence on Hadoop often requires input from data scientists who:
- Create models that relate information across Hadoop.
- Structure databases to contain that data.
- Ensure different BI tools generate consistent results when referencing identical data elements.
How many data scientists you'll have to hire depends on the size of your organization and the composition of the data under its control. In this post, we review how much it typically costs to hire a data scientist, the factors you'll have to consider when assembling a team, and ways you can alleviate the workload placed on data scientists.
In the world of Business Intelligence and Big Data there continue to be a number of exciting innovations as new and improved options for processing large data sets appear on the market. You may be familiar with AtScale’s BI-on-Hadoop Benchmarks - where we focus on evaluating the top SQL-on-Hadoop engines and their fitness to support traditional BI-style queries. As we continue to work with customers who are navigating their journey to BI on Big Data, we are increasingly getting questions about the emerging cloud-based data processing engines.
In this blog post, we will take a deeper look at Google’s BigQuery, and how it stacks up in the BI-on-Big Data ecosystem.
CONTINUING OUR TRACK RECORD OF RAPID DELIVERY & INNOVATION
Today we announced the general availability of AtScale 5.0 and I couldn’t be more excited about the host of great new features that are included in this release. As we’ve continued to gain traction in a number of industries - ranging from healthcare to retail to financial services to telco to online - we continue to learn from our customers and feed these learnings directly back into our product features. With the release of 5.0, AtScale customers now have an even richer set of capabilities that they can use to derive business insights and value from their Big Data investments. I’ve included some of the highlights of the release in the sections below.
I’ve asked it before and I’ll ask it again. Wouldn’t it be great if you could easily analyze ALL your data from a single Excel file? We all know this isn’t feasible, especially when dealing with big data and complex business analytics needs.
In working at the intersection of Big Data and traditional Business Intelligence, the AtScale team has encountered a number of complex business analytics use cases that are difficult, if not near-impossible, to solve using typical table-based data models and SQL. Today, I’m going to share why and how complex analysis, like multi-level metrics, is no longer as ‘difficult’ or ‘near-impossible’ as it once was.
Yesterday, Gartner published the 2017 Magic Quadrant for Business Intelligence. The MQ research for BI has been in existence for close to a decade; it is THE document of reference for buyers of Business Intelligence technology.
Wouldn’t it be great if you could load all of your data from a single file into an Excel pivot table for easy analysis?
Unfortunately, this approach isn’t usually viable when dealing with complex business analytics and big data. Take, for example, a typical use case found in the world of healthcare insurance. A large insurance provider has tens of millions of members and processes hundreds of millions of claims a year. As flexible as Excel is, we all know it won’t handle this volume or velocity of data.
As a result, more and more enterprises store large data sets in big data platforms like Hadoop. And while Hadoop provides a low-cost and performant approach to store and process this information, there is still the challenge of supporting the many types of analytics required on claims and member data sets. But why? With all of the advances in technology, why can a simple calculation cause so much complexity?
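To see why, consider a multi-level metric such as "average claim spend per member." A flat, claim-level aggregation gives a different (and wrong) answer than aggregating per member first and then averaging across members. The sketch below illustrates the pitfall with pandas; the column names and numbers are illustrative, not drawn from any real claims data.

```python
import pandas as pd

# Hypothetical claims table: column names are illustrative only.
claims = pd.DataFrame({
    "member_id": ["m1", "m1", "m1", "m2"],
    "claim_amount": [100.0, 200.0, 300.0, 400.0],
})

# Naive single-level metric: averages over claim ROWS.
avg_per_claim = claims["claim_amount"].mean()  # (100+200+300+400)/4 = 250.0

# Multi-level metric: total per MEMBER first, then average across members.
per_member_total = claims.groupby("member_id")["claim_amount"].sum()  # m1=600, m2=400
avg_spend_per_member = per_member_total.mean()  # (600+400)/2 = 500.0

print(avg_per_claim, avg_spend_per_member)  # 250.0 vs 500.0
```

The two numbers differ because the metric's meaning depends on the level of aggregation, which a single flat SQL `AVG()` over a joined table cannot express without nested subqueries.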
Yes, there are actually ways to 'Do Big Data Analytics Right'.
Leaders and innovators in the Big Data space have learned the hard way, and now those of you looking to dip your toe, or jump head first, into the BI on Big Data waters can capitalize on their early experiences. Let go of the fear, ego, or whatever else may be holding you back, and take the chance to learn from those who took the early-adopter risk.
I am a software engineer and I like abstractions. I like abstractions because, done correctly, an abstraction will factor complexity down to a level where I don’t have to spend any brain cycles thinking about it. Abstraction lets me work with a well-thought-out interface designed to let me accomplish more without having to always consider the system at a molecular level.
It turns out business people also like abstraction. This shouldn’t be surprising as businesses model complex real world concepts where the details matter. From calculation to contextual meaning, abstraction helps with correctness and understanding.
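The same idea can be sketched in a few lines: business users reference a metric by name, and the interface hides how it is calculated. Everything here - the function names, the metric registry, the order rows - is an illustrative toy, not AtScale's actual API.

```python
# Illustrative metric definitions: the formula lives in exactly one place.
def total_revenue(rows):
    return sum(r["price"] * r["qty"] for r in rows)

def order_count(rows):
    return len(rows)

# A tiny "semantic layer": a registry mapping business names to calculations.
METRICS = {"total_revenue": total_revenue, "order_count": order_count}

def compute(metric_name, rows):
    """Callers ask for a metric by name and never see the formula."""
    return METRICS[metric_name](rows)

orders = [{"price": 10.0, "qty": 2}, {"price": 5.0, "qty": 1}]
print(compute("total_revenue", orders))  # 25.0
print(compute("order_count", orders))    # 2
```

Because every consumer goes through `compute`, a change to the revenue formula happens once, and every report picks it up - which is the correctness and shared-understanding benefit the paragraph above describes.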
Topics: Semantic Layer
With every New Year come new trends. As we experience changes across the world of Analytics and Big Data, we took a look at sources we follow and trust and put together our Top Five 2017 BI and Big Data Trends to Watch.
- Big Data, no longer just Hadoop -- Tableau
- Consider the Cloud -- Tech Republic
- Query Performance Matters -- Information Management
- Business-driven Apps Drive Value for the Data Lake -- MapR
- Big Data becoming part of Business Fabric -- Datanami
Read on to hear why, how, and where leaders in the BI and Big Data space say enterprises are, or should be, focusing to stay ahead of the curve and succeed with BI and Big Data.
Topics: 2017 Big Data Trends