Gartner Positions Snowflake as a Challenger in Magic Quadrant

The number of different metrics that enterprise SaaS providers must measure themselves against changes and grows by the week. Most of those metrics are generated internally. To balance that mix, it’s crucial that SaaS providers undergo rigorous reviews from reputable, time-honored industry analysts.

For example, Gartner’s 2018 Magic Quadrant for Data Management Solutions for Analytics (DMSA) report* has named Snowflake as a Challenger. We feel this is a reflection of how revolutionary the built-for-the-cloud data warehouse is for our customers.

We believe Snowflake’s improved position on the Ability to Execute and Completeness of Vision axes of Gartner’s Magic Quadrant has been achieved by increasing our business and market presence, and delivering the technology and innovation our customers need.

In our view, this latest achievement further shows that our cloud-built data warehouse continues to challenge the legacy providers in the data warehousing industry and to serve the current and future needs of the data-driven enterprise.

Snowflake continues to achieve great things, and our internal metrics reflect that. We at Snowflake are dedicated to serving our customers to the best of our abilities, a dedication embodied in our number one company value: Put the customer first.

In addition, Snowflake recently closed its largest round of growth funding to date, $263 million, advancing our company valuation to $1.5 billion. It’s another indicator from outside of Snowflake, from the venture capital community, that Snowflake continues to enable the data economy.

But there is much more work to do. Many barriers still exist for enterprises to access all of their data, and to share live, secure and governed data between themselves and their business partners. We’re keen to remove these barriers on a global scale with one of our latest features, Snowflake Data Sharing.

In addition, Snowflake Data Sharing enables enterprises to transform their data into a business asset. Snowflake customers can now monetize and easily share data, creating new market opportunities that were previously unforeseen. They can also benefit from data shared with them to enhance their products and services, lead their industries and streamline their operations.

Gartner’s DMSA report is a welcome evaluation of Snowflake’s continued pursuit of the innovative solutions enterprises need to get all the insight from all their data. We are challenging the large, legacy data warehouse vendors, but they are not our primary focus. Instead, we’re targeting the evolving needs of our customers, who must be the true winners in all of this.

*Gartner, “Magic Quadrant for Data Management Solutions for Analytics,” by Adam Ronthal, Roxane Edjlali, Rick Greenwald and Donald Feinberg, February 2, 2018.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

 

Dresner ADI Report: Snowflake Recommended by 100% of Customer Survey Respondents

Snowflake has ranked highly among data warehouse vendors in the quadrants of the 2018 Analytical Data Infrastructure (ADI) Market Study, achieving the highest or second-highest marks across all of the metrics within the study’s two models: Customer Experience (product/technology, sales and service) and Vendor Credibility (product value and ethics).

The ADI study is based on a detailed survey completed by customers of each competing vendor. The customer responses formed the basis for how each vendor’s data warehouse ranked across the two models.

The first is the Customer Experience Model. Within the survey, customers report their experience with the sales and service staff of their data warehouse vendor, along with their experience with their vendor’s technology. Snowflake had the best combined score across both of these attributes. Download the report to see our placement.

The Vendor Credibility Model illustrates the overall perception of the value customers receive from a product along with their confidence in their chosen data warehouse vendor. Vendors that rated highly on the vertical axis of this quadrant are perceived as highly trustworthy, while a placement to the right shows significant value being driven by the product within the customer organization. Snowflake also ranked high in this model. Download the report to see our placement.

Most importantly, 100 percent of the Snowflake customer respondents said they would recommend Snowflake to others. We’re thankful for the confidence our customers have placed in us. Snowflake is a values-driven company and our most important value is “Put Customers First”. It’s the value that drove us to create the only data warehouse built for the cloud, so customers could experience infinite scalability. More recently, it’s the value that led us to develop Instant Elasticity, lowering customer costs by up to 80% in some cases. We believe this report validates our approach and our focus on our customers. Download the 2018 Analytical Data Infrastructure Market Study to learn more.

Try Snowflake for free. Sign up and receive US$400 worth of free usage. You can create a sandbox or launch a production implementation from the same Snowflake environment.

Data is Only Transformative with Transformative Technology

At the recent AWS re:Invent show in Las Vegas, theCUBE host Lisa Martin had the chance to sit down with Bob Muglia, CEO and President of Snowflake. Bob shared his thoughts on Snowflake’s latest addition to its cloud-built data warehouse, Snowpipe, while looking back at Snowflake’s origins and ahead to its future in order to enable the data-driven enterprise.

What is Snowpipe, and how do customers get started with it?

Muglia: Snowpipe is a way of ingesting data into Snowflake in a streaming, continuous way. You simply drop new data that’s coming into S3 and we ingest it for you automatically. Snowpipe makes it simple to bring the data into your data warehouse on a continuous basis, ensuring that you’re always up to date and that your analysts are getting the latest insights and the latest data.
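
To make that answer concrete, here is a minimal sketch of pointing Snowpipe at an S3 location, expressed through the Snowflake Python connector. The account, credentials, stage, table and bucket names are hypothetical placeholders, not details from the interview; consult the Snowflake documentation for the authoritative syntax and options.

    # Minimal sketch with hypothetical names: define an external stage over the
    # S3 location where files land, then a pipe that continuously copies newly
    # arriving files into a table. AUTO_INGEST = TRUE lets Snowpipe react to S3
    # event notifications instead of waiting for scheduled batch loads.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # hypothetical account identifier
        user="my_user",
        password="my_password",
    )
    cur = conn.cursor()

    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (v VARIANT)")

    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_events_stage
          URL = 's3://my-bucket/events/'
          CREDENTIALS = (AWS_KEY_ID = 'my_key_id' AWS_SECRET_KEY = 'my_secret')
          FILE_FORMAT = (TYPE = 'JSON')
    """)

    cur.execute("""
        CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
          COPY INTO raw_events FROM @raw_events_stage
    """)

From that point on, new files dropped into the bucket are picked up and loaded without a warehouse to size or a schedule to manage.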

In the five years since you launched, how has the opportunity around cloud data warehousing changed? How has Snowflake evolved to become a leader in this space?

Muglia: If you go back five years, this was a timeframe where NoSQL was all the rage. Everybody was talking about how SQL was passé and something you’re not going to see in the future. Our founders had a different view. They had been working on true relational databases for almost 20 years, and they recognized the power of SQL and relational database technology. But they also saw that customers were experiencing significant limitations with existing technology. They saw in the cloud, and in what Amazon had done, the ability to build an all new database that takes advantage of the full elasticity and power of the cloud to deliver whatever analytics the business requires. However much data you want, however many queries you want to run simultaneously, Snowflake takes what you love about a relational database and allows you to operate in a very different way. Our founders had that vision five years ago and successfully executed on it. The product has worked beyond the dreams of our customers, and that response from our customers is what we get so excited about.

How did you identify what data should even be streamed to Snowpipe?

Muglia: As an example, in entertainment we’re experiencing a data explosion. You have streaming video data, subscription data, billing data, social media data and on and on. None of this is arriving in any sort of regular format. It’s coming as semi-structured data, like JSON or XML. Up until Snowflake came onto the scene with a truly cloud-based solution for data warehousing, everyone was struggling to wrangle all these data sets. Snowpipe lets you bring in multiple data sets, merge them in real-time and get the analytics back to your business in an agile way that’s never been seen before.
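
As a hedged illustration of that point (the tables and JSON fields below are invented for the example), semi-structured data loaded this way lands in a VARIANT column and can be queried with path notation and joined directly against relational tables:

    # Illustrative only: raw_events (VARIANT column v) and subscriptions are
    # hypothetical tables. JSON fields are addressed with path notation and
    # cast inline, so no schema has to be defined before loading.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password"
    )
    cur = conn.cursor()

    cur.execute("""
        SELECT s.plan_name,
               e.v:device.type::string AS device_type,
               COUNT(*)                AS plays
        FROM   raw_events e
        JOIN   subscriptions s
          ON   s.account_id = e.v:account_id::number
        GROUP BY 1, 2
        ORDER BY plays DESC
    """)
    for row in cur.fetchall():
        print(row)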

How does your partnership with AWS extend Snowflake’s capabilities?

Muglia: People don’t want their data scattered all over the place. With the cloud, with what Amazon’s done and with a product like Snowflake, you can bring all of your data together. That can change the culture of a company and the way people work. All of a sudden, data is not power. Data is available to everyone, and it’s democratized so every person can work with that data and help to bring the business forward. It can really change the dynamics around the way people work.

Tell us a little bit about Snowflake’s collaboration with its customers. How are they helping to influence your future?

Muglia: As a company, we run Snowflake on Snowflake. All of our data is in Snowflake, all of our sales data, our financial data, our marketing data, our product support data and our engineering data. Every time a user runs a query, that query is logged in Snowflake and the intrinsics about it are logged. When you have a tool with the power of Snowflake, you can effectively answer any business question in just a matter of minutes. And that’s transformative to the way people work. And to me, that’s what it means to build a data-driven culture: The answers to business questions are inside what customers are doing and are encapsulated in the data.
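
As a small, hypothetical illustration of the “Snowflake on Snowflake” idea, the logged query metadata can itself be queried like any other table. The grouping below is an invented business question, not Snowflake’s actual internal analysis:

    # Hypothetical sketch: summarize a week of logged queries per user from the
    # standard ACCOUNT_USAGE.QUERY_HISTORY view. The specific question asked
    # here is illustrative only.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password"
    )
    cur = conn.cursor()

    cur.execute("""
        SELECT user_name,
               COUNT(*)                       AS queries_run,
               AVG(total_elapsed_time) / 1000 AS avg_seconds
        FROM   snowflake.account_usage.query_history
        WHERE  start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        GROUP BY user_name
        ORDER BY queries_run DESC
    """)
    for row in cur.fetchall():
        print(row)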

Try Snowflake for free. Sign up and receive US$400 worth of free usage. You can create a sandbox or launch a production implementation from the same Snowflake environment.

How Snowpipe Streamlines Your Continuous Data Loading and Your Business

For anyone who harbors a love-hate relationship with data loading, it’s time to tip the scales.

We all know data can be difficult to work with. The challenges start with the varying formats and complexity of the data itself. This is especially the case with semi-structured data such as JSON, Avro and XML, and it continues with the significant programming skills needed to extract and process data from multiple sources. Making matters worse, traditional on-premises and cloud data warehouses require batch loading of data (with limitations on the size of data files ingested) and huge manual efforts to run and manage servers.

The results? Poor, slow performance and the inability to extract immediate insights from all your data. Data scientists and analysts are forced to wait days or even weeks before they can use the data to develop accurate models, spot trends and identify opportunities. Consequently, executives don’t get the up-to-the-minute insights they need to make real-time decisions with confidence and speed.

Common problems that affect data loading include:

  • Legacy architecture – Tightly coupled storage and compute force data loading to contend with queries for resources.
  • Stale data – Batch loading prevents organizations from acquiring instant, data-driven insight.
  • Limited data – Lack of support for semi-structured data requires transforming newer data types and defining a schema before loading, which introduces delays.
  • Manageability – Dedicated clusters or warehouses are required to handle the loading of data.
  • High-maintenance – Traditional data warehouse tools result in unnecessary overhead in the form of constant indexing, tuning, sorting and vacuuming.

These obstacles all point to the need for a solution that allows continuous data loading without impacting other workloads, without requiring the management of servers and without crippling the performance of your data warehouse.

Introducing Snowpipe, our continuous, automated and cost-effective service that loads all of your data quickly and efficiently without any manual effort. How does Snowpipe work?

Snowpipe automatically listens for new data as it arrives in your cloud storage environment and continuously loads it into Snowflake. With Snowpipe’s unlimited concurrency, other workloads are never impacted, and you benefit from serverless, continuous loading without ever worrying about provisioning. That’s right. There are no servers to manage and no manual effort is required. Snowpipe makes all this happen automatically.

The direct benefits of Snowpipe’s continuous data loading include:

  • Instant insights – Immediately provide fresh data to all your business users without contention.
  • Cost-effectiveness – Pay only for the per-second compute utilized to load data rather than running a warehouse continuously or by the hour. 
  • Ease-of-use – Point Snowpipe at an S3 bucket from within the Snowflake UI and data will automatically load asynchronously as it arrives.
  • Flexibility – Technical resources can interface directly with the programmatic REST API, using Java and Python SDKs to enable highly customized loading use cases (see the sketch after this list).
  • Zero management – Snowpipe automatically provisions the correct capacity for the data being loaded. No servers or management to worry about.
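
For the Flexibility point above, a rough sketch of the programmatic path might look like the following, using the snowflake-ingest Python SDK. The key file, account details, pipe name and file path are hypothetical, and the exact constructor arguments should be checked against the SDK documentation; this is a sketch, not a definitive implementation.

    # Rough sketch: notify Snowpipe over its REST API that new files have been
    # staged, using the snowflake-ingest SDK. All names and paths below are
    # placeholders.
    from snowflake.ingest import SimpleIngestManager, StagedFile

    with open("rsa_key.pem") as key_file:      # hypothetical key-pair auth key
        private_key = key_file.read()

    ingest_manager = SimpleIngestManager(
        account="my_account",                          # hypothetical account
        host="my_account.snowflakecomputing.com",      # hypothetical host
        user="my_user",
        pipe="MY_DB.PUBLIC.RAW_EVENTS_PIPE",           # fully qualified pipe name
        private_key=private_key,
    )

    # Queue the newly staged files; Snowpipe loads them asynchronously and bills
    # only the per-second compute actually used for the load.
    response = ingest_manager.ingest_files([StagedFile("events/2018/01/batch1.json", None)])
    print(response)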

Snowpipe frees up resources across your organization so you can focus on analyzing your data, not managing it. Snowpipe puts your data on pace with near real-time analytics. At Snowflake, we tip the scales on your love-hate relationship with data so you can cherish your data without reservation.

Read more about the technical aspects of Snowpipe on our engineering blog. For an in-depth look at Snowpipe in action, you can also join us for a live webinar on December 14th.

Try Snowflake for free. Sign up and receive US$400 worth of free usage. You can create a sandbox or launch a production implementation from the same Snowflake environment.

 

5 Must-Have Features for High Concurrency Environments

Whether you’re new to cloud data warehousing or comparing multiple cloud data warehouse technologies, it’s critical to assess whether your data warehouse environment will need to support concurrent processing of any sort. Unless you’re a lone database shop, in all likelihood the answer is yes. Concurrency, or concurrent data processing, is simultaneous access and/or manipulation of the same data. This is not to be confused with parallel processing, which is multiple operations happening at the same time, but not against the same data.

Concurrency can take the form of multiple users interactively exploring and manipulating a particular data set, concurrent applications querying and visualizing the same data set, transactional updates to the data, or even concurrent loading of new data or change data into the data set. And if you thought to yourself, “What? I can concurrently load new data, while supporting queries on the same data set?,” then keep reading.

If you require concurrency at any capacity, you will want these five cloud data warehouse features:

  • High relational performance across a broad range of data types: Of course, you want high performance; that’s a given. However, the more challenging aspect to plan for is fast queries on a broad range of data types, including semi-structured/JSON data. You don’t want your relational data warehouse to bog down and hold up corporate users because now it must handle non-traditional data from groups like the web team or product engineering. JSON is the new normal.
  • Automatic and instant warehouse scaling: Let’s say your warehouse can handle a high number of concurrent accesses, but it bogs down during a period of high demand. Now what? This is a very important question to have answered as you compare technologies. Will you have to kick off users? Will you have to schedule after-hours jobs? Will you have to add nodes? Will this require you to redistribute data? If so, redistributing data takes time, and it’s a double whammy because the existing data in the warehouse has to be unloaded first. This is a huge disruption to your users. Instead, you want the ability to load balance across new virtual warehouses (compute engines) and have your cloud data warehouse continue to execute at fast speeds, including loading new data, all against the same data set. The more automatically this load balancing happens, the better.
  • ACID compliance: Right up there with poor performance is inaccurate results or reporting due to inconsistent data or dirty reads. The more people, workgroups or applications access the same data simultaneously, the more pressure there is to maintain data consistency. A data warehouse with ACID compliance ensures consistency and data integrity without requiring you to write scripts or manage integrity and consistency manually yourself.
  • Multi-statement transaction support: Tied to ACID compliance is multi-statement transaction support. If you have users that group multiple statements within a single transaction, you want a warehouse solution you can trust to execute the transaction completely and with integrity, or to automatically roll it back should any statement in the sequence fail (see the sketch after this list).
  • Data sharing: Data integrity and freshness should not stop just because you have to share data with an external stakeholder. Traditional approaches require engaging in an ETL process or spending time manually deconstructing, securing and transferring data. Data consumers on the receiving end must also spend time to reverse your steps and reconstruct data. This process is slow and manual, and does not ensure your data consumers are always working with live and up to date data. Data sharing allows you to eliminate all of this effort and ensure access to live data.
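
To illustrate the multi-statement transaction point above, here is a minimal sketch using the Snowflake Python connector; the accounts table and the transfer logic are hypothetical. Either both updates commit together or both are rolled back, so concurrent readers never see a half-applied change.

    # Minimal sketch with a hypothetical table: wrap two dependent updates in a
    # single transaction so they succeed or fail as one unit.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password"
    )
    cur = conn.cursor()

    try:
        cur.execute("BEGIN")
        cur.execute("UPDATE accounts SET balance = balance - 100 WHERE id = 1")
        cur.execute("UPDATE accounts SET balance = balance + 100 WHERE id = 2")
        cur.execute("COMMIT")
    except Exception:
        cur.execute("ROLLBACK")   # undo both statements if either one fails
        raise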

The business value of proactively planning for concurrency is ensuring that your cloud data warehouse can support your environment regardless of what it throws at you, especially during times of sudden, unpredictable, heavy query loads against a common data set.

Try Snowflake for free. Sign up and receive US$400 worth of free usage. You can create a sandbox or launch a production implementation from the same Snowflake environment.

Deliveroo Delivers with Real-time Data

In a field of struggling food delivery startups, one notable success story has emerged from the fray. Termed “the European unicorn” by TechCrunch, Deliveroo is a British startup that offers fast and reliable food delivery service from a premium network of restaurants.

Deliveroo recently raised a $385 million funding round, boasts an estimated $2 billion valuation and is credited with transforming the way people think about food delivery. What is this unicorn doing differently? How has it found success where so many others have failed?

“Data is baked into every aspect of the organization,” said Henry Crawford, Deliveroo’s head of business intelligence. “Having instant access to data reveals which geographic areas are experiencing a shortage of restaurants and a shortage of particular cuisines, so we can create these hubs right at the consumer’s doorstep.”

Deliveroo analyzes customer behavior, gains insights into market trends and responds with swift decisions and rapid execution by using data-driven insights. Snowflake makes all of this possible.

“With data coming from a variety of sources, including web traffic, transactions and customer behavior, having a data warehouse built for the cloud provides one repository for a single source of truth,” Henry explains. “The shift to Snowflake’s cloud data warehouse has enabled us to make good on our promise that got Deliveroo started: To connect consumers with great food from great restaurants, wherever you are, and whatever it takes.”

Snowflake also accommodated Deliveroo’s 650% growth in 2016. Such rapid momentum prompted Deliveroo to expand its business intelligence team from two employees to 14. The additional team members triggered the need for more access to the same data, but without impacting performance.

Since Snowflake is built for the cloud, an unlimited number of users can access all of an organization’s data from a single repository, which is critical to Deliveroo’s success. There’s no replicating data, shifting queries and other workloads to non-business hours, or queueing users to preserve performance. Instead, Snowflake’s true cloud elasticity means Deliveroo can automatically scale up, down and out (concurrency) to load and analyze data without disruption.

“None of these future plans would be possible without real-time, concurrent access to massive volumes of data,” Henry said.

What’s next for Deliveroo? Using real-time logistics algorithms to increase the number and the speed of deliveries. Deliveroo’s expansion plans also include an “Editions” program—delivery-only kitchens so partner restaurants can expand their footprint without opening brick-and-mortar locations.

Learn more about how Snowflake can accelerate your data storage and analytics initiatives.

Snowflake establishes UK foothold with Thibaut Ceyrolle as VP of Sales

Snowflake’s journey to streamline access to data-driven insight in the cloud age continues to capture the imagination. From the beginning, we knew our built-for-the-cloud data warehouse would solve a widespread industry problem. But competing in the data warehouse industry means we’ve adopted a degree of modesty in the face of gigantic opposition. In light of our growing customer base, and support from innumerable sectors, we can start to be a little less bashful.

Earlier this month we secured another $100 million in funding. This will help expand our current operations and establish new footholds globally. We’re pleased to announce we’ve begun trading in the UK with super-smart new offices in Paddington, London as we increase our global reach.

To meet the demands that come with our ambitions, we have appointed Thibaut Ceyrolle as our Vice President of Sales for EMEA.

Thibaut joins us after a career of nearly two decades at the forefront of disruptive technology. Following a start at Hewlett-Packard in ’98, he has been an instrumental player in driving the cloud revolution from the outset. He quickly ascended through the ranks of digital transformation proponents Devoteam and BMC Software before serving as VP of EMEA Sales at Bazaarvoice. An expert in bringing complex technologies to the wider market, Thibaut brings the drive, experience and international outlook needed to strategically promote our service. He is also experienced in launching new offices, expanding operations into new regions and building a strong corporate culture similar to Snowflake’s.

In early 2015, we offered the first commercial version of Snowflake, the one and only data warehouse built for the cloud. We didn’t set out to improve a flawed legacy architecture; we set out to create something new: a fresh start in an industry filled with legacy products. And we’ve been noticed.

In the beginning, it was mostly early cloud adopters that saw the potential of our technology. Since then, we’ve seen horizontal expansion as every industry has come to see the benefits. Having recently signed our 500th customer, it seems we’re starting to snowball (sorry!).

Snowflake is a service we believe will continue to replace legacy on-premises and cloud systems quickly, quietly and, with Thibaut at the UK helm, ubiquitously. Many would think us overly ambitious given the monolithic competition, but the data warehousing industry has let inefficiencies fester. Snowflake’s a new breed entirely and companies want what we’re offering.

The tech world is fed up with wrestling for access to data, and Snowflake is just too good a product to pass up. The challenges associated with handling big data are a limitation of legacy technology, not a fact of life. The cloud has been around for quite some time, but surprisingly, Snowflake is the only cloud data warehouse solution built from the ground up for the cloud. We’re honoured to be the ones bringing relief to customers moving to and accelerating their business in the cloud.

We’re overjoyed with our recent round of funding, but it is grounded in hard evidence: the incomparable flexibility and speed we have shown Snowflake to be capable of, running up to 200 times faster for a tenth of the price.

No-one has done what we’re doing for the industry. We expect Thibaut to find a hugely rewarding experience here.

Snowflake Cloud Analytics City Tour

Join Snowflake at the Cloud Analytics City Tour in June to hear from leading cloud analytics and data practitioners. The international tour kicks off in London on June 1, will visit eight US cities and will center on the theme “Your Data Struggle Ends Now”. The one-day events will bring together the brightest minds in data and analytics to discuss the latest trends, best practices and lessons learned in data warehousing, big data and analytics.

Register here for the London Cloud Analytics City Tour stop before May 5, 2017, to receive the Early Bird discount. To receive an additional 15% off the Early Bird price, use the registration code SPECIALBLOG.

Register here for the US Cloud Analytics City Tour stops. For the latest updates on events, speakers, and registration, visit the Cloud Analytics City Tour website.

Snowflake Vision Emerges as Industry Benchmark

Technology research and analysis firm Gigaom has ranked Snowflake as the #1 cloud data warehouse in a recent study. We surpassed enterprise data warehouse products including Google BigQuery, Teradata, IBM dashDB, HPE Vertica, Microsoft Azure SQL, SAP HANA and Oracle Exadata. Snowflake emerged with a top score of 4.85 out of a possible 5.0. The competition averaged a score of 3.5. The six “disruption vectors” Gigaom used as its key scoring criteria are congruent with what we wanted to achieve back in the summer of 2012, when we started Snowflake.

But long before we wrote the first line of Snowflake code, we asked one another: “What should a data warehouse deliver that no other product has before? How can we enable organizations to make the best, data-driven decisions? And how will the world’s most powerful data warehouse help organizations achieve their existing goals and help reveal their future goals?” We then set out to answer those questions.

We wanted to enable organizations to easily and affordably store all of their data in one location, and make that data accessible to all concurrent users without degrading performance. We also wanted Snowflake to scale infinitely, easily and cost-effectively, so organizations would pay only for the compute and storage they used. And the product had to work with the tools that users already knew and loved. Finally, we wanted a data warehouse that required zero management by our customers – nothing to tweak, no tuning required. These defining qualities aligned with the new world of cloud services, and they are what formed the foundation of Snowflake.

What’s happened since the early days of Snowflake? We got to work, and we stuck to hiring the best engineers the world has to offer. We built Snowflake from the ground up, for the cloud, and incorporated all of these elements as the core of the product. In early 2015, we offered the first commercial version of Snowflake – the one and only data warehouse built for the cloud. Since then, our engineering team has added more and more industry-leading capabilities to Snowflake, leapfrogging the traditional data warehouse vendors.

Along the way, we’ve hired high-calibre teams to execute the sales, marketing and finance functions of the company so our customers and partners get the highest value from working with Snowflake. We also built a great customer support organization, providing the level of service our users love. In more recent times, we’ve expanded operations outside of North America to Europe, with Asia-Pacific and other regions coming online soon. We’ve also added Snowflake On Demand™ – the easiest way to get started with Snowflake by simply signing up on our website with just a credit card. All of these efforts over the past four years have led to Snowflake’s most recent inflection point – being chosen as the number one cloud data warehouse.

What does all this mean? Snowflake’s current and future customers have every opportunity to explore all of their data in ways they never thought possible. They can gain the insight, solve the problems and create the opportunities they simply couldn’t with their previous data platforms. We committed to building the world’s best data warehouse – the only data warehouse built for the cloud. Our customers, our partners and now the industry have indicated we’ve likely achieved what we set out to do back in the summer of 2012. Going forward, we’ll continue to serve our customers and partners with the best technology, the best solutions and the best services available.

Read the full report >

Migrating to the Cloud? Why you should start with your EDW

Many organizations we engage with are seriously considering transforming their business and moving some (or all) of their IT operations into the cloud. A lot of executives I have encountered are struggling with the same question: “How do I get started?” There is a strong case to be made that starting with your Enterprise Data Warehouse (EDW), or at least a data mart, is the fastest and most risk-free path, with added upside potential to increase revenue and set you up for future growth. As operational data volumes continue to grow at exponential rates, it’s not a matter of if you go to the cloud to manage your enterprise data, but when.

Before going too far on your cloud journey, I would recommend an exercise in segmenting your business from an IT perspective in a very simple way. To get you started, let me suggest five possible categories, along with some risks to consider for each:

  • Customer-facing Applications – This is the heart and soul of your business. If something goes wrong, you lose business and revenue, and people potentially get fired. Risk: HIGH
  • Internal Applications – Mail, Payroll, General Ledger, AP, AR, things like that. Every person inside the organization relies on at least one of these services, and a lot of analysis needs to take place to figure out all the integration points to ensure nothing gets missed during a migration to the cloud. Risk: HIGH
  • Desktop/Laptop OS and Applications – There are whole books and schools of thought about how to migrate these, which means it’s a big decision and a big deal. Impacting everyone in the company on your first cloud initiative? Risk: HIGH
  • Operations Monitoring and Alerting – Got a Network Operation Center (NOC)? These guys are integrated with every system that is important, so moving them to the cloud could be a large undertaking. Risk: HIGH
  • Reporting and Analytics – Hmmm… if my constituents don’t get their weekly or monthly reports on time, is that a disaster? Can they get by with a small outage during the migration? Risk: LOW

Starting with the Data

Let’s take a closer look at why starting your cloud journey with your EDW could be a viable option, and even have some benefits that could help sell the idea (of the cloud) internally. In no particular order, I would highlight these points:

  • Doesn’t disrupt the business – Many EDW implementations are not mission critical today (as compared to enterprise applications). As more data becomes available through social media or Internet of Things (IOT) applications, businesses need access to much larger volumes of data and they will want access to it earlier in the data pipeline. Traditional DWs contain aggregations and are used for doing trend analysis, analyzing data over a period of time to make strategic, rather than tactical decisions. They are not architected to handle this new influx of raw data in a cost-effective manner. By starting your cloud journey with the EDW, you reduce risk (by going to a more flexible architecture) while getting your team early exposure to working with cloud services.
  • Doesn’t disrupt internal users – When moving to the cloud, you want to show incremental success and don’t want to add a lot of unnecessary risk. It’s simple to keep running your existing EDW in parallel with your new cloud DW, giving you a built-in fall-back plan for the early stages. Or you may decide to start with a small data mart as a pilot project.
  • Start-up costs are a fraction of those for on-premises appliance solutions – Some of our customers invested as much as $10 million (or more) years ago on a data warehouse appliance that is now technologically outdated, and the renewal costs to keep that technology going are coming due. If they re-invest another huge sum of money, this will delay their move to the cloud by another 4-5 years, putting them behind their competition. Rather than outlaying a large capital expenditure to extend the life of the older technology, it may make better sense to move to the cloud. The cloud offers a utility-based model, allowing you to pay for what you use, when you use it, as opposed to what you think you are going to need 2-3 years in the future. As a result, not only is the cost of entry lower, but you are not risking a huge sum of money to make the move.
  • Data is growing at an exponential rate – Will you ever have less data to worry about in your business? If you plan on being successful, I don’t think so. Many organizations are looking at new and different ways to manage and analyze ever-increasing volumes of data coming in various formats from multiple sources (such as semi-structured web logs). Your current on-premises EDW was not designed for this kind of workload or data.  If you are considering changing infrastructure platforms to accommodate it, why not select tools that were built for today’s modern data challenges instead of legacy-based architectures? Moving to the cloud also gives you the opportunity to consolidate operations and streamline business processes.
  • Enable new capabilities – There are some new analytic paradigms happening in the cloud (such as machine learning). Cloud-based platforms allow you to work with both detailed and aggregated data at scales never imagined (see the case study about DoubleDown as an example). Need to run a complex analytic job on a 256-node Massively Parallel Processing (MPP) cluster for an hour, and then shut it down? No problem (see the sketch after this list). Can your platform support a thousand users without concurrency issues? How would that change your business if it could dynamically adjust to handle those new demands?
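
As a hedged sketch of the “spin it up, then shut it down” idea above (the warehouse name, chosen size and query are illustrative assumptions, not a prescription), a dedicated compute cluster can be created for a heavy job and suspended as soon as it finishes:

    # Hypothetical sketch: create a large virtual warehouse for a one-off
    # analytic job, run the query, then suspend the warehouse so compute
    # charges stop.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password"
    )
    cur = conn.cursor()

    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS heavy_analytics_wh
          WAREHOUSE_SIZE = '4X-LARGE'
          AUTO_SUSPEND = 300              -- also suspend after 5 idle minutes
          INITIALLY_SUSPENDED = TRUE
    """)
    cur.execute("USE WAREHOUSE heavy_analytics_wh")   # auto-resumes when a query runs
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")  # hypothetical query
    print(cur.fetchall())

    cur.execute("ALTER WAREHOUSE heavy_analytics_wh SUSPEND")  # stop compute billing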

As with any infrastructure move, the benefits have to be clear enough that the status quo mentality can be overcome and analysis paralysis doesn’t push out your journey to the cloud for months or even years. The beauty of the cloud model is that it is easy to start small and scale without risking a huge investment up front. Every business needs some proof before committing time and resources to move anything to the cloud and your EDW is a perfect candidate. Snowflake is the first and only EDW built for the cloud to be truly elastic for all of your analytic and big data needs.

Please feel free to reach out to us at info@snowflake.net. We would love to help you on your journey to the cloud. And keep an eye on this blog or follow us on Twitter (@snowflakedb) to keep up with all the news and happenings here at Snowflake Computing.

Looking Back at 2016 Predictions

Last December, I made some predictions for 2016. As we approach the end of the year, I thought it only fair to look back and compare what I predicted to what has happened.

Do or die for big old tech

This was an easy one to get right. Big old enterprise tech companies are hunkering down and watching the world pass them by. HP and Dell are vying to be the king of legacy. There is money in this but who really wants to wear that crown?

IBM is trying to move on with Watson but can Ginni Rometty really pivot that aircraft carrier? And can Watson provide Jeopardy-winning answers for a variety of industries without an army of IBM consultants to spoon feed it? Only time will tell but there is reason to be skeptical.

At Oracle, Larry seems to have discovered the cloud (and will probably soon claim that he invented it). But he remains confused about what a cloud really is. When Oracle talks about Exadata Cloud Service, legacy hardware in a managed services datacenter, they demonstrate they’re still lost in the fog.

Overall, 2016 was not a good year for big old enterprise tech.

Public cloud wins, but who loses?

My prediction on the progress of private clouds was almost an understatement. This year, the move towards private clouds has been slower than molasses on a cold winter day. VMware continues to miss the mark, failing to deliver a cost-effective private cloud solution. And OpenStack is a confusing grab bag that requires a huge SI investment, which is beyond the reach of almost all customers.

Meanwhile, almost every company, including most financial services firms, is now committed to adopting the public cloud. Amazon of course is the big winner, but Microsoft has shown once again they will persevere and succeed. Last year, I picked Google as the wildcard. Diane Greene appears to have brought focus to Google, and they clearly gained ground in 2016. Google possesses the technical capability, but they still need to get a lot more serious on the sales side as they have no enterprise experience. A recent query on LinkedIn shows 465 sales openings for Microsoft, 604 sales positions for Amazon, and only 85 open sales roles for Google Cloud. Google can’t compete against Amazon and Microsoft with just 85 more sales people.

The other major public cloud player that has emerged strong in 2016 is Alibaba. China cloud is set to explode in 2017. While it will be tough for Alibaba to gain traction in the US, in China it will almost certainly be the winning player.

All of the other public cloud wannabes are in a world of hurt. It looks like we’ll have four public clouds: Amazon, Microsoft, Google and Alibaba.

Spark divorces Hadoop

As I predicted last year, 2016 was not a good year for Hadoop and specifically for Hadoop distribution vendors. Hortonworks is trading at one-third its IPO price and the open source projects are wandering off. IaaS cloud vendors are offering their own implementations of the open source compute engines – Hive, Presto, Impala and Spark. HDFS is legacy in the cloud and is rapidly being replaced by blob storage such as S3. Hadoop demonstrates the perils of being an open source vendor in a cloud-centric world. IaaS vendors incorporate the open source technology and leave the open source service vendor high and dry.

Open source data analysis remains a complicated and confusing world. Wouldn’t it be nice if there were one database that could do it all? Wait, there is one, it’s called Snowflake.

What do Donald Trump and EU bureaucrats have in common?

Looking back at 2016, I guess not much. 2016 is a year that EU bureaucrats would rather forget and The Donald will remember forever.

On the privacy side, we saw some encouraging news with the creation of Privacy Shield. That said, Privacy Shield is already being challenged and this space remains uncertain. On a purely positive note, Microsoft won the case in Ireland that prevents the US government from grabbing data stored in other countries. The ruling was critical for any U.S. cloud company that has a global footprint.

Perhaps the most encouraging thing from 2016 is that Europe has a full plate given the challenges of Brexit, a Donald Trump-led America, ongoing immigration issues and upcoming elections with strong populist candidates. Given these problems, concerns about privacy are likely to take a back seat so the bureaucrats may be content to stand behind Privacy Shield.

About that wall, Donald hasn’t said too much lately but I think we will see something go up on the border. He loves construction.