The 2018 Dataworks Summit is just around the corner. As you prepare for your trip to San Jose, it’s time to think about how to maximize your time at the event. Dataworks Summit will take place from June 18 to June 21, with sessions, keynotes, and workshops spread across eight different tracks. Check out the full agenda. Everyone may have different goals for this summit, so as you go through the agenda to select the best sessions for you and your organization, here are our recommendations.
“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run” - Roy Charles Amara
Mr. Amara, an American researcher and futurist, probably didn’t anticipate how much wisdom was encapsulated in those few words. Buying technology is hard, and enterprise IT buyers are often left with the difficult task of determining whether the new piece of technology they just heard about is pure hype or holds real promise. Where are they to find consistent guidance?
It may seem like only yesterday that we said goodbye to 2017, but we are almost halfway through 2018. Big things in Big Data happened in the month of April. Many of us watched Mark Zuckerberg testify in front of Congress about data use and security, and we still await the final outcome of Cambridge Analytica’s data abuse. If April seemed to slip through your fingers, check out what you might have missed in the world of big data.
My previous blog highlighted some best practices to gain immediate value from all the capabilities that AtScale offers when using Tableau as your BI tool of choice. One of those best practices is to use AtScale-created date dimensions to improve query performance, which is particularly helpful when using date dimensions as filters.
Five years ago we had a hypothesis that Business Intelligence (BI) needed a reboot. We planned to take the best parts of original BI ideas and merge them with modern engineering and data analytics to build a platform for delivering self-serve, secure, curated and fast analysis to the entire business. Our strategy, we believed, would build a bridge from the old world to the new world while giving us a stage to present new concepts that made Business Intelligence truly intelligent.
Organizations have come to the realization that data is a core part of their strategy and that a scalable distributed computing platform is central to their technology investment. However, a challenge that big data practitioners face is deciding which use case they should implement first on their journey toward realizing their big data strategy. The reality is that multiple items need to be addressed: choosing the right technology, securing requisite funding, and finding the right technical talent. However, identifying the right use case with a defined success outcome is the most crucial starting point for a big data project.
Oftentimes, customers approach me with questions about AtScale’s ability to integrate into a customer’s operational stack. Today, I want to highlight a component of AtScale’s Development Extensions called Webhooks.
A webhook (also called a web callback or HTTP push API) is a way for an application to provide other applications with real-time information. A webhook delivers data to other applications as certain events occur, meaning you receive data immediately, as opposed to a REST API, which you would need to poll very frequently to approximate real-time delivery. This makes webhooks much more efficient for both provider and consumer.
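To make the push model concrete, here is a minimal sketch of the consumer side of a webhook: a function that receives a delivered JSON payload and dispatches on its event type. The event names and payload fields (`type`, `data`, `model.published`) are illustrative assumptions for this sketch, not AtScale’s actual webhook schema; in practice this function would be wired to the POST endpoint of whatever web framework you run.

```python
import json

# Hypothetical event handlers -- the event names and payload fields here
# are illustrative, not AtScale's actual webhook schema.
def on_model_published(data):
    return f"model {data.get('name')} published"

def on_query_failed(data):
    return f"query {data.get('id')} failed"

HANDLERS = {
    "model.published": on_model_published,
    "query.failed": on_query_failed,
}

def handle_webhook(body: bytes) -> str:
    """Parse a webhook POST body and dispatch on its event type.

    The provider pushes this payload to us the moment the event occurs;
    we never have to poll for it.
    """
    event = json.loads(body)
    handler = HANDLERS.get(event.get("type"))
    if handler is None:
        return "ignored"  # unknown event types are safely skipped
    return handler(event.get("data", {}))

# Example delivery a provider might POST to our endpoint:
print(handle_webhook(b'{"type": "model.published", "data": {"name": "sales"}}'))
```

The key contrast with polling a REST API is that all of the work here is event-driven: the consumer does nothing until the provider delivers a payload.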
March is gone and spring has arrived, at least for many of us. A lot happened in March, and we certainly don't want you to miss out on what’s big in big data. Without further ado, here is what you might have missed in March.
The joy of working as a Customer Success Solution Architect is that I have the opportunity to work with many different customers and each challenges us with a different Big Data use case.
I've worked with enterprises that offload their Netezza database into the cloud. I've seen companies analyze social media data in real time. I've helped teams streamline operational processes and increase efficiency in production lines. Big Data provides enterprises a competitive advantage and reduces operational costs across these varied scenarios. However, setting up a big data environment is not for the faint-hearted - or is it?
If you are like me, a Tableau fan, you’ve probably used Tableau for many years, attended numerous Tableau Conferences, and cheered with great enthusiasm when the engineers at Tableau demonstrated the latest and greatest enhancements to the software. You may also be very accustomed to creating your own calculations based on the row level data you are connected to. You enjoy the freedoms that Tableau offers.