Almost five years ago to the day, my co-founders and I opened the doors of our company in downtown San Mateo. It was just the five of us, driven by the desire to make Big Data work. We had been scarred by complex Big Data analytics projects in our former lives and were frustrated that the Business Intelligence market didn’t have a good solution for the data stored in data lakes.
Since then, the AtScale team has grown to almost 100 talented folks. Our company has not only earned the trust of great investors (we announced our Series C today) but has also grown into a technology leader that has driven success at many enterprises.
Companies across the world, from American Express to Vivint, use our software across a wide variety of use cases, and many, like GlaxoSmithKline, TD Bank and Yellow Pages, have received industry recognition for their work.
We are well on our way to achieving our mission, and looking back at the last five years, I couldn’t be more proud of the work of my team and the support of our partners and customers. Together, we have set the stage for a major transformation in the BI world, and I can’t wait to see what the next five years bring.
2022: Next-Gen Business Intelligence
First, I’ll acknowledge the obvious. AtScale wouldn’t be experiencing such rapid growth if it weren’t benefiting from a major shift in the Business Intelligence market, driven by three key disruptions:
- Self-service BI, the force behind the industry’s 60% year-over-year growth in 2015 according to Gartner, has shaken the industry and has resulted in several unintended consequences. Business units have bought all the tools they could buy, and now their IT departments need to bring order to their BI chaos.
- Data Lakes have become ubiquitous. The average Chief Information Officer spends close to $20M a year to orchestrate Big Data analytics workloads across traditional data warehouses like Teradata, modern data platforms like Hadoop and next-generation serverless technologies like Google BigQuery. As a result, the data platform ecosystem has become more complex, not less.
- The Cloud, as highlighted in our last Big Data Maturity survey, will drive 72% of enterprise CIOs to modernize their enterprise data architecture and scale their deployments economically. This will bring great benefits to the enterprise, but it will also result in mixed on-premise and cloud environments, creating more headaches for IT and business users alike.
These trends have created both opportunities and challenges for enterprise CIOs. To embrace the future, industry leaders need to let go of past practices and the very notion of an on-premise, enterprise data warehouse.
Many are still grappling with the consequences of these market trends and are wrestling with a “Big Data Gap” that makes it difficult for their users to leverage all their rich enterprise data to innovate, reduce cost and drive revenue.
The “Big Data Gap”: Let’s Get Horizontal
The average IT organization has three options to deal with this new enterprise data reality:
- “BI lock-in”. Forcing business users to standardize on one vertically integrated BI stack for all their analytics needs often fails because it runs counter to business users’ desire for self-service. IT invariably ends up supporting multiple “BI stacks”, building data pipelines for each and managing multiple semantic layers, while increasing overall budgets and decreasing agility. Even worse, organizations end up with overlapping, competing business metrics embedded in each silo, driving confusion and inconsistency.
- Making “Big Data Small”. The popularity of the data lake has enabled enterprises to capture and store increasingly larger and more granular data sets. However, data lakes don’t work well with BI tools, so IT continues to shrink data into data marts backed by the same legacy technologies they sought to replace. The result: more cost, more complexity, and a loss of fidelity and granularity. Data marts and the enterprise data warehouse are the enemy of scale, and the enemy of AtScale.
- Do Nothing. Continuing traditional data practices is the most common approach. Eventually, the sheer size, scope and velocity of data makes “doing nothing” a job-threatening strategy.
As you can read in the many case studies we’ve published to date, industry leaders have embraced the heterogeneity of their enterprises’ BI environments. The winners have enabled business users to continue to work with the BI tools they love by driving “horizontal data standardization” and embracing data platform and BI tool heterogeneity.
Why a Universal Semantic Layer, Now?
The enterprises we’ve worked with have embraced their users’ needs by creating a framework for democratizing access to data for bigger and better insights for all. AtScale is the glue that simplifies this complex environment.
Our secret sauce is our Universal Semantic Layer. The concept of a semantic layer for data is not new. If you google its definition, you’ll find the following: “A semantic layer is a business representation of corporate data that helps end users access data autonomously using common business terms. A semantic layer maps complex data into familiar business terms such as product, customer, or revenue to offer a unified, consolidated view of data across the organization.”
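To make the definition above concrete, here is a toy sketch of the core idea behind a semantic layer: a mapping from familiar business terms to the physical columns and expressions that implement them, so users can ask for “customer” and “revenue” without knowing the underlying schema. This is purely illustrative, every table and column name below is hypothetical, and it is not how AtScale is implemented.

```python
# Toy semantic layer: business terms map to physical columns/expressions.
# All table and column names are made up for illustration.
SEMANTIC_LAYER = {
    "customer": "dim_customer.customer_name",
    "product":  "dim_product.product_name",
    "revenue":  "SUM(fact_sales.unit_price * fact_sales.quantity)",
}

def to_sql(business_terms,
           from_clause="fact_sales JOIN dim_customer USING (customer_id)"):
    """Translate a list of business terms into a SELECT statement.

    (Real implementations would also derive joins and GROUP BY clauses;
    this sketch only shows the term-to-column mapping.)
    """
    cols = [f"{SEMANTIC_LAYER[term]} AS {term}" for term in business_terms]
    return "SELECT " + ", ".join(cols) + " FROM " + from_clause

print(to_sql(["customer", "revenue"]))
```

Because every tool resolves “revenue” through the same mapping, the metric is defined once rather than re-implemented in each BI silo.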
What is new, however, is the idea that the Business Intelligence platforms of 2022 will require this semantic layer to be Universal, Open, Virtual and Infinitely Scalable.
- Universal & Open. Our customers love that their users can leverage the tools they already know and love to query all data with best-in-class performance, governance and security. We believe that the Business Intelligence platforms of the future will natively work with any BI tool & any data platform, big & small.
- Virtual. Moving data causes all types of trouble. It forces IT to create complicated, costly and potentially insecure data pipelines. It forces business users to wait for stale and incomplete data. It results in multiple, competing metric definitions for multiple tools and use cases. We believe that the Business Intelligence Platforms of the future will maintain business definitions in one place, virtually.
- Infinitely Scalable. Modern data platforms need to scale easily and economically. The data lake architecture is built for scale but doesn’t work well for BI tools. We believe that the standard for Business Intelligence of the future will work well for any BI workloads and will scale for big and small data, whether on-premise or in the cloud.
For further information on this topic, I suggest you read the blog post by one of my co-founders, Matt Baird, here, or take a look at last week’s interview with Mark Ramsey, Chief Data Officer at GlaxoSmithKline, about how his company leverages AtScale.
Thanks for your continued support and thanks for reading!