Snowflake expands EU footprint with Dublin deployment

Snowflake continues the global expansion of its built-for-the-cloud data warehouse with its second AWS deployment in Europe. Snowflake’s European customers, and global customers with operations in the EU, can now choose to run their workloads from our Frankfurt deployment and/or from our newly added Dublin region. Dublin represents Snowflake’s fifth global deployment, following instances in US West (Oregon), US East (Northern Virginia), EU (Frankfurt) and Asia Pacific (Sydney). Having a presence in multiple AWS regions gives customers more flexibility to choose a deployment that best fits their needs. Some key factors to consider include:

  • Cost: Based on AWS prices, Snowflake’s compute and storage costs vary by region. Availability in multiple European regions allows Snowflake to offer organizations competitive pricing. Refer to our product pricing guide for more information.
  • Compliance: Customers with compliance and data sovereignty requirements may choose one region over another. Some countries mandate that data be stored in a particular region. Snowflake does not move data between accounts and/or regions unless the customer chooses to do so. This enables customers to accelerate their global data initiatives and satisfy country-specific data requirements.
  • Latency: Having the option to choose between multiple regions is an important factor for customers who need the lowest latency from their Snowflake data warehouse-as-a-service. Since Snowflake offers multiple deployment options across the EU, North America and Asia, customers can select the region with closer geographical proximity to their end users.
  • Data Egress: Another factor to consider is data egress costs. Cloud infrastructure providers such as AWS charge egress fees to customers who move or copy their data between multiple regions. Although data egress does not impact most of our customers, customers may wish to choose a region that minimizes cross-region data egress. Refer to the data egress section of our product pricing guide for more details.
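Egress charges are simple to estimate once you know the per-GB rate. As a minimal sketch, the helper below multiplies volume by rate; the $0.02/GB figure is an assumption for illustration only, since actual AWS inter-region transfer pricing varies by region pair and changes over time.

```python
# Rough estimate of cross-region data egress cost.
# The default per-GB rate is an illustrative assumption, not a quoted AWS price.

def egress_cost_usd(gb_transferred: float, rate_per_gb: float = 0.02) -> float:
    """Return the estimated cost of copying data across regions."""
    return gb_transferred * rate_per_gb

# Copying a 500 GB dataset between regions at an assumed $0.02/GB:
print(egress_cost_usd(500))  # → 10.0
```

Running the numbers this way makes it easy to see when keeping data within a single region is the cheaper choice.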

Snowflake’s number one company value is to put our customers first. Our Dublin deployment is another example of our customer-centric strategy, allowing you to choose the option that best fits your application and workload needs. For more information about how Snowflake plans to serve our customers’ global data needs, click here to read a blog from our CEO Bob Muglia.

Disruption as a Stepping Stone

It’s one thing to disrupt an entire industry with a revolutionary technology or business model. It’s something else to disrupt an industry as a step towards your ultimate goal. That’s what we’re doing at Snowflake – disrupting the data warehouse industry on the way to enabling the data economy.

Snowflake’s founders first gathered in 2012 with a vision to enable limitless insight from enormous amounts of varying data through a cost-effective, powerful and secure data warehouse built for the cloud. That vision was realized in 2015, when Snowflake’s cloud-built data warehouse became commercially available. Since then, Snowflake has grown exponentially thanks to our unique and market-leading architecture that far surpasses what competitors offer via their legacy products – both on-premises solutions and those ported to the cloud.

But that was just the beginning. Snowflake is built to be much more than the disrupter of data warehousing and data analytics. It’s an enabler of something even more important: the data economy, where data is the source of business value. With the data economy, enterprises of all sizes, across all industries and across the globe will generate business value by sharing and consuming each other’s live data in an easy, powerful, cost-effective and secure way.

Today, Snowflake announced new growth funding of $263 million – more than double all previous funding Snowflake has received. This latest funding will help Snowflake meet the ever-increasing demand from enterprises for Snowflake’s cloud-built data warehouse.

The funding will help expand our R&D initiatives by growing our engineering team at both our Silicon Valley headquarters and Snowflake’s newest engineering office in Bellevue, Washington. It will also enable us to advance the revolutionary cloud architecture of our data warehouse so our customers will gain even more insight from data to streamline their businesses, better serve their customers and lead their industries.

Snowflake will also use the funding to advance one of our latest innovations – Snowflake Data Sharing, which we also refer to as The Data Sharehouse™. Until Snowflake Data Sharing emerged, enterprises were forced to use risky, costly and labor-intensive methods to share only slices of stale data. With Snowflake Data Sharing, enterprises can provide each other with governed, secure access to live data within minutes. This has far-reaching implications for large enterprises, for organizations sharing data within their ecosystem of business partners and, beyond that, for commercializing and monetizing data in the emerging data economy.

Where to from here? As enterprises realize the limitless possibilities of modern, cloud data sharing, the data economy will grow. To help fuel this growth, Snowflake will continue our focus on removing the barriers to data access no matter where that data resides – within the enterprise, between enterprises or in a multitude of locations and computing platforms that span the globe. Our vision, and our goal, is to enable any organization to become a data-driven leader in the new data economy.

The Virtual Private Snowflake Story

Snowflake was built as a secure, multi-tenant SaaS data warehouse. We’ve been offering a multi-tenant product designed for high-security needs for years now. But we knew from previous discussions with Capital One and other financial services companies that a dedicated solution would be required to meet their regulatory requirements.

The input we received from the financial services industry gave rise to our idea for a new product: Virtual Private Snowflake (VPS). Those deep discussions with industry customers helped shape and define the fundamental requirements of VPS:

  • Certifiably Secure with PCI support, validated by the customer’s security team.
  • The customer is in complete Control of the data with comprehensive Auditability. Customer-managed keys are a must, as is a complete record of all operations performed by the data warehouse.
  • Resilience to failures with a roadmap to cross-region business continuity. The long-term goal is a 15-minute recovery time from a total regional failure.
  • Isolation that delivers a dedicated instance of Snowflake, running in a separate Virtual Private Cloud.

Snowflake and our financial services customers have worked together since that time to further define the product requirements. Today, I’m proud to announce that our journey to building the most secure cloud data warehouse has reached its first milestone. Virtual Private Snowflake is now commercially available, with Capital One as our first customer.

VPS delivers the full power of the Snowflake service in the form of a fully managed, dedicated pod running in Amazon Web Services (AWS). Financial services, healthcare and companies from many other industries that handle sensitive data get the best of all worlds with VPS: a dedicated instance of the best cloud data warehouse.

There are many milestones that are still ahead for the most secure, flexible and powerful cloud data warehouse available. But helping customers succeed with their data projects is what we do at Snowflake. That commitment is true for all of our customers.

We thank Capital One and our other financial services customers for their support of VPS. And we look forward to working with all of our customers to help them solve their toughest data challenges.

To learn more about VPS and about Snowflake’s approach to securing sensitive data, click here.

What makes Snowflake a data warehouse?

One of the most common questions I get when speaking to people about Snowflake is: “Why do you call it a data warehouse and not a database?” This is a very reasonable question given some of the characteristics of Snowflake.

At Snowflake, we describe ourselves, in part, as a full relational database management system (RDBMS) built for the cloud. We are ACID compliant and we support standard SQL. Sounds like a database to me, too. Let’s take a closer look just to be sure.

What is a database?

A database is a collection of information organized to be easily accessed, managed and updated. While there are many types of databases available today, the most common is an RDBMS. But when most folks say “database”, they usually mean a traditional RDBMS that handles Online Transaction Processing (OLTP).

So, what are some of the defining characteristics of an OLTP database?

  • Designed for rapid storage and retrieval of small sets of current data records in support of transactions and interactions within an enterprise.
  • Data is organized in tables and columns, allowing users access via structured query language (SQL).
  • Handles quick, real-time activity such as entering a customer name, recording a sale and recording all accounting activity of that sale.
  • Works well for basic operational reporting of a limited number of records. Analytic reporting is relegated to simple, static reports often driven by IT.
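The OLTP pattern above can be sketched in a few lines. This is a minimal illustration using Python's built-in SQLite engine; the table and data are invented for the example and stand in for any transactional system.

```python
import sqlite3

# OLTP-style work: short transactions that write and read small sets of
# current records. Table and column names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)

# Record a sale inside a transaction; the context manager commits on success,
# making the write durable (the "D" in ACID).
with conn:
    conn.execute(
        "INSERT INTO sales (customer, amount) VALUES (?, ?)", ("Acme", 99.5)
    )

# A quick point lookup of a single current record, typical of OLTP.
row = conn.execute("SELECT customer, amount FROM sales WHERE id = 1").fetchone()
print(row)  # → ('Acme', 99.5)
```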

What is a data warehouse?

Some of the defining characteristics of a data warehouse are:

  • A database designed to store and process large volumes of current and historical data collected from multiple sources inside and outside the enterprise for deep analysis.
  • Organizes data into tables and columns, and allows users access via SQL.
  • Optimized for loading, integrating and analyzing very large amounts of data.
  • Designed to support descriptive, diagnostic, predictive and prescriptive analytic workloads.
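By contrast, a warehouse-style query scans and aggregates many historical rows rather than fetching one record. The sketch below uses the same built-in SQLite engine with synthetic data, purely to show the shape of an analytic workload.

```python
import sqlite3

# Warehouse-style work: aggregate over current and historical rows from
# multiple periods. The data below is synthetic, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EU", 2015, 100.0), ("EU", 2016, 150.0), ("US", 2016, 200.0)],
)

# Descriptive analytics: total revenue per region across all years.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # → [('EU', 250.0), ('US', 200.0)]
```

A real warehouse runs this kind of scan-and-aggregate query over billions of rows, which is exactly what the specialized engines discussed below are optimized for.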

Snowflake definitely includes the overlapping characteristics of both a database and a data warehouse: ACID compliance, support for standard SQL, etc. But Snowflake also embodies all of the defining characteristics of a data warehouse.

One of the key differentiators of Snowflake from other solutions is that it’s specifically designed for data warehousing and high-speed analytic processing. Rather than a generalized SQL database that has been “tuned” or even adapted to handle these types of workloads, Snowflake was built from the ground up for the cloud to optimize loading, processing and query performance for very large volumes of data. Therefore, hands down, Snowflake is a data warehouse.

So, why do we still need a specialized data warehouse engine?

As OLTP databases have been able to scale higher and innovations like in-memory databases have emerged, some organizations have questioned whether they still need a separate technology or specialized system for reporting and analytics. The answer, again, requires us to look at the basics: What benefits emerge from storing and analyzing data in a separate system?

  1. It eases the burden of reporting from transactional systems by removing the contention for limited and expensive resources.
  2. It produces more business-friendly data results by allowing the data to be restructured to a more suitable format.
  3. It provides access to a wider array of reports more quickly because all the resources in the data warehouse are dedicated to reporting and analysis.
  4. It integrates valuable data from across the enterprise for richer insight. Something that can’t (and shouldn’t) be done in an OLTP system.

For more information on how you can up your data warehousing game with a modern, built-for-the-cloud approach, check out some of our free resources such as our ebook The Data Warehouse: The Engine That Drives Analytics. We would love to help you on your journey to the cloud so keep an eye on this blog or follow us on Twitter (@snowflakedb and @kentgraziano) to keep up with all the news and happenings here at Snowflake.

Snowflake establishes UK foothold with Thibaut Ceyrolle as VP of Sales

Snowflake’s journey to streamline access to data-driven insight in the cloud age continues to capture the imagination. From the beginning, we knew our built-for-the-cloud data warehouse would solve a widespread industry problem. But competing in the data warehouse industry means we’ve adopted a degree of modesty in the face of gigantic opposition. In light of our growing customer base, and support from innumerable sectors, we can start to be a little less bashful.

Earlier this month we secured another $100 million in funding. This will help expand our current operations and establish new footholds globally. We’re pleased to announce we’ve begun trading in the UK with super-smart new offices in Paddington, London as we increase our global reach.

To meet the demands that come with our ambitions, we have appointed Thibaut Ceyrolle as our Vice President of Sales for EMEA.

Thibaut joins us following a career of nearly two decades at the forefront of disruptive technology. After starting at Hewlett-Packard in ’98, he was an instrumental player in driving the cloud revolution from the outset. He quickly ascended through the ranks of digital transformation proponents DevoTeam and BMC Software, before serving as VP of EMEA Sales at Bazaarvoice. An expert in bringing complex technologies to the wider market, Thibaut brings the drive, experience and international outlook needed to strategically promote our service. He also has experience launching new offices, expanding operations into new regions and building a strong corporate culture like Snowflake’s.

In early 2015, we offered the first commercial version of Snowflake: the one and only data warehouse built for the cloud. We didn’t set out to improve a flawed legacy architecture; we set out to create something new, a fresh start in an industry filled with legacy products. And we’ve been noticed.

In the beginning, it was mostly early cloud adopters that saw the potential of our technology. Since then, we’ve seen horizontal expansion as every industry has come to see the benefits. Having recently signed our 500th customer, it seems we’re starting to snowball (sorry!).

Snowflake is a service we believe will continue to replace legacy on-premises and cloud systems quickly, quietly and, with Thibaut at the UK helm, ubiquitously. Many would think us overly ambitious given the monolithic competition, but the data warehousing industry has let inefficiencies fester. Snowflake’s a new breed entirely and companies want what we’re offering.

The tech world is fed up with wrestling for access to data, and Snowflake is just too good a product to pass up. The challenges associated with handling big data are a limitation of legacy technology, not a fact of life. The cloud has been around for quite some time, but surprisingly, Snowflake is the only cloud data warehouse solution built from the ground up for the cloud. We’re honoured to be the ones bringing relief to customers moving to, and accelerating their business in, the cloud.

We’re overjoyed with our recent round of funding, but it is grounded in hard evidence: the incomparable flexibility and speed Snowflake has demonstrated – up to 200 times faster for a tenth of the price.

No-one has done what we’re doing for the industry, and we expect Thibaut will find the experience here hugely rewarding.

Snowflake Cloud Analytics City Tour

Join Snowflake at The Cloud Analytics City Tour in June to hear from leading cloud analytics and data practitioners. The international tour kicks off in London on June 1, then visits eight US cities, all centered on the theme “Your Data Struggle Ends Now”. The one-day events will bring together the brightest minds in data and analytics to discuss the latest trends, best practices and lessons learned in data warehousing, big data and analytics.

Register here for the London Cloud Analytics City Tour stop before 5 May, 2017 to receive the Early Bird discount. To receive an additional 15% off the Early Bird price, use the registration code: SPECIALBLOG.

Register here for the US Cloud Analytics City Tour stops. For the latest updates on events, speakers, and registration, visit the Cloud Analytics City Tour website.

Growing Snowflake to help you end your struggle for data

In business, your impact directly correlates with the problems you strive to solve. Snowflake’s founders gathered in 2012 to unravel a prevalent and persistent struggle that has hindered organizations ever since OLTP systems emerged five decades ago. Data professionals continue to ask, “How can we derive timely, business insight from all the data available to our enterprise?”

Their struggle is bigger today than it has ever been. The exponential increase in available data in the past 10 years has outpaced the improvements in legacy data warehouse solutions for on-premises and cloud environments. It’s also the reason why Snowflake emerged, and why our customers, technology partners and funding partners continue to invest in us.

With that said, I’m proud to announce Snowflake’s latest round of $100 million of growth funding led by ICONIQ Capital and accompanied by Madrona Venture Group. Snowflake’s Series D round also includes all of Snowflake’s existing funding partners: Altimeter Capital, Redpoint Ventures, Sutter Hill Ventures and Wing Ventures. Since its founding in 2012, Snowflake has raised a total of $205 million in funding.

The cloud has enabled enterprises of all types and sizes access to unlimited data. But therein also lies the struggle: Nearly every enterprise has many on-premises and cloud-based data silos they struggle to bring together. How do you integrate all of those varying forms of data and make all of that data available for deep analysis by all your users without degrading performance? For many customers, it’s difficult to even find the relevant data.

Snowflake is designed to make the struggle for data disappear. Our revolutionary, built-for-the-cloud architecture and proven technology have delivered up to 200 times faster performance for our growing customer base, and at one-tenth the cost of their previous data warehouse solutions. We’ve also enabled customers to launch Snowflake within just a few months, and in many instances, just a few weeks. But all of that, and more, are just the beginning.

With this latest funding round, Snowflake continues our pursuit to help transform every organization into a data-driven enterprise. We are outpacing all of our competitors by enabling customers to do things with data they had never envisioned. Snowflake delivers what Oracle, Teradata, Hadoop and other cloud data warehouses can’t even begin to offer: the performance, concurrency and simplicity needed to analyze all your data, from one location, by all your users.

Our new funding will also enable a host of initiatives to expand Snowflake’s operations to serve our growing customer base in the US and around the globe. Our UK office is expanding rapidly, with plans to establish offices across Europe and in Asia Pacific. We’ve also expanded our engineering operations with an office in Bellevue, Washington, allowing Snowflake to tap into the best and brightest talent of the Seattle area.

Snowflake is helping enterprises to end their struggle for data. Our rapidly growing base of customers and partners confirms this. And our latest round of growth funding further validates, and enables, Snowflake’s pursuit of the data-driven enterprise. So, be on the lookout, Snowflake is coming your way. And we want to help you end your struggle for data.

Looking Back at 2016 Predictions

Last December, I made some predictions for 2016. As we approach the end of the year, I thought it only fair to look back and compare what I predicted to what has happened.

Do or die for big old tech

This was an easy one to get right. Big old enterprise tech companies are hunkering down and watching the world pass them by. HP and Dell are vying to be the king of legacy. There is money in this but who really wants to wear that crown?

IBM is trying to move on with Watson but can Ginni Rometty really pivot that aircraft carrier? And can Watson provide Jeopardy-winning answers for a variety of industries without an army of IBM consultants to spoon feed it? Only time will tell but there is reason to be skeptical.

At Oracle, Larry seems to have discovered the cloud (and will probably soon claim that he invented it). But he remains confused about what a cloud really is. When Oracle talks about Exadata Cloud Service, legacy hardware in a managed services datacenter, they demonstrate they’re still lost in the fog.

Overall, 2016 was not a good year for big old enterprise tech.

Public cloud wins, but who loses?

My prediction on the progress of private clouds was almost an understatement. This year, the move towards private clouds has been slower than molasses on a cold winter day. VMware continues to miss the mark, failing to deliver a cost-effective private cloud solution. And OpenStack is a confusing grab bag that requires a huge SI investment, which is beyond the reach of almost all customers.

Meanwhile, almost every company, including most financial services firms, is now committed to adopting the public cloud. Amazon of course is the big winner, but Microsoft has shown once again they will persevere and succeed. Last year, I picked Google as the wildcard. Diane Greene appears to have brought focus to Google, and they clearly gained ground in 2016. Google possesses the technical capability, but they still need to get a lot more serious on the sales side, as they have no enterprise experience. A recent query on LinkedIn shows 465 sales openings for Microsoft, 604 for Amazon, and only 85 for Google cloud. Google can’t compete against Amazon and Microsoft with just 85 open sales roles.

The other major public cloud player that emerged strong in 2016 is Alibaba. The China cloud market is set to explode in 2017. While it will be tough for Alibaba to gain traction in the US, in China it will almost certainly be the winning player.

All of the other public cloud wannabes are in a world of hurt. It looks like we’ll have four public clouds – Amazon, Microsoft, Google and Alibaba.

Spark divorces Hadoop

As I predicted last year, 2016 was not a good year for Hadoop and specifically for Hadoop distribution vendors. Hortonworks is trading at one-third its IPO price and the open source projects are wandering off. IaaS cloud vendors are offering their own implementations of the open source compute engines – Hive, Presto, Impala and Spark. HDFS is legacy in the cloud and is rapidly being replaced by blob storage such as S3. Hadoop demonstrates the perils of being an open source vendor in a cloud-centric world. IaaS vendors incorporate the open source technology and leave the open source service vendor high and dry.

Open source data analysis remains a complicated and confusing world. Wouldn’t it be nice if there were one database that could do it all? Wait, there is one, it’s called Snowflake.

What do Donald Trump and EU bureaucrats have in common?

Looking back at 2016, I guess not much. 2016 is a year that EU bureaucrats would rather forget and The Donald will remember forever.

On the privacy side, we saw some encouraging news with the creation of Privacy Shield. That said, Privacy Shield is already being challenged and this space remains uncertain. On a purely positive note, Microsoft won the case in Ireland that prevents the US government from grabbing data stored in other countries. The ruling was critical for any U.S. cloud company that has a global footprint.

Perhaps the most encouraging thing from 2016 is that Europe has a full plate given the challenges of Brexit, a Donald Trump-led America, ongoing immigration issues and upcoming elections with strong populist candidates. Given these problems, concerns about privacy are likely to take a back seat so the bureaucrats may be content to stand behind Privacy Shield.

About that wall, Donald hasn’t said too much lately but I think we will see something go up on the border. He loves construction.

The True Value of Cloud Data Storage Continues to Emerge

We’re in interesting times. Like most significant trends, the data-driven economy revealed a powerful approach that was unique but always in plain sight. We listened and watched closely as experts across industries and different roles promulgated the benefits of capturing, storing and using data from every corner of cyberspace. And not far behind came a related and more interesting topic of connecting the offline world to capture previously unimagined amounts of data, ranging from kitchen appliances to jet engines. This we now know to be the Internet of Things (IoT).

We all acknowledged this data shift would change how companies do business and how we live our lives. As with all significant themes, additional thought on the ‘how’ soon follows. Once we capture all of this data, how will we manage it? How will we effectively store and access petabytes of data, and more, so we can put that data to work?

These aren’t questions just for the governments of the largest countries or for global enterprises. All organizations, from garage start-ups to mid-size companies, are keen to harness the insight derived from more and more data. As wonderful as this seems, it all comes down to technology and cost: the cost of storing that data, and the technology to easily derive insight from it. But how does an organization accomplish this within its financial limits?

Our founders placed this at the heart of Snowflake. Before they typed the first line of code that ultimately brought the Snowflake cloud data warehouse to life, they wanted to enable data without limits. Snowflake’s built-for-the-cloud architecture truly separates compute from storage, allowing customers to easily scale either resource up and down. This also means Snowflake customers can focus their efforts on the highest value of data warehousing – compute. This is just one of many strategic advances, along with our unmatched technology, that makes Snowflake the most powerful and affordable data warehouse for all of an organization’s data warehousing and analytics.
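The practical consequence of separating compute from storage is that each resource scales and is billed independently. The toy model below illustrates that idea; the rates and the credit-hour billing shape are invented for illustration and are not Snowflake's actual price list.

```python
# A toy model of compute/storage separation: each resource is scaled and
# billed on its own. Rates below are invented for illustration only.

def monthly_cost(storage_tb: float, compute_credit_hours: float,
                 storage_rate: float = 30.0, credit_rate: float = 2.0) -> float:
    """Storage and compute are priced independently, then summed."""
    return storage_tb * storage_rate + compute_credit_hours * credit_rate

base = monthly_cost(storage_tb=10, compute_credit_hours=100)   # 300 + 200
burst = monthly_cost(storage_tb=10, compute_credit_hours=400)  # compute 4x, storage unchanged
print(base, burst)  # → 500.0 1100.0
```

Note that quadrupling compute leaves the storage term untouched, which is the point: in a coupled architecture you would have to scale both together.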

With that said, Snowflake lowered its storage pricing in October to match Amazon’s S3 storage price. Today, Snowflake again lowered its price to match Amazon’s latest S3 price reduction. This strategy is a crucial component to truly realizing a data-driven world for all – data without limits. The amount of data the world creates continues to increase at an exponential rate. And to harness the insight from that data, organizations need the best technology at the best price. Snowflake has always been there and always will be.

To read more about our latest pricing announcement, click here.

The most powerful and easy-to-use data warehouse is now the most affordable

When I joined Snowflake nearly three years ago, I knew the seasoned engineers who envisioned and developed the product were on to something big. I’ve spent nearly my entire career in the database business but hadn’t seen anything comparable to Snowflake – a built-for-the-cloud data warehouse that’s powerful and easy to use for analyzing all of your data.

Snowflake has now taken another leap forward for our customers. We’ve launched three key initiatives to enable organizations to easily and affordably store and analyze all of their data in one location: Snowflake. No more data silos, data lakes or duplicate systems to manage the volume, variety, velocity and cost of today’s data.

Our first initiative upends the traditional cost structure that has made the data warehouse a precious, limited resource in the past. We’re making Snowflake the most affordable data warehouse available by removing the cost of data storage as a barrier to bringing together all data in one place. Taking advantage of our unique architecture, we’ve lowered our storage price to as low as $30/TB/month, which represents a 75 percent storage cost savings for our customers. The most powerful and easy-to-use data warehouse is now the most affordable. Thanks to our built-for-the-cloud architecture, specifically our separation of compute and storage, we’re able to offer storage at the same price as Amazon S3 – Snowflake’s cloud storage provider.
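The quoted figures are easy to sanity-check: a 75 percent reduction down to $30/TB/month implies a prior price of $120/TB/month (the prior price is not stated in the post; it is derived here).

```python
# Back out the implied old storage price from the quoted discount.
new_price = 30.0   # USD per TB per month, as quoted
savings = 0.75     # 75 percent storage cost savings, as quoted

implied_old_price = new_price / (1 - savings)
print(implied_old_price)  # → 120.0
```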

Why have we done this? Our customers continue to tell us how much Snowflake has changed the way they use data. But they’re understandably concerned about the cost to store much larger volumes of data in Snowflake. Therefore, we’ve changed our storage pricing to help customers focus on the much more important process: analyzing data for insights.

Secondly, we’ve added a quick and simple way to get up and running with Snowflake. Snowflake On Demand allows customers to start using Snowflake with just a credit card and a simple sign-up process via our website. Data users of all types can now experience the many benefits of Snowflake without friction or delay.

And thirdly, Snowflake has launched its newest deployment, in the Frankfurt (EU) region. Organizations with EU headquarters, and multinational organizations with EU operations, can now keep their data in the EU. This is huge for any organization keen to keep their EU data close to home, and for any organization that wants to advance their global data initiatives from region to region.

All of these offerings will be available in November.

At Snowflake, we love to hear what our customers have to say. Their input, and our hunger to deliver and evolve the best data warehouse available, drive everyone at Snowflake to serve our customers the best way possible. Snowflake’s technology, solutions and customer-centric strategy will never cease to evolve. We look forward to continuing our mission of helping organizations advance their operations, serve their customers and lead their industries with the insight derived from data without limits.

For more details on all of Snowflake’s newest initiatives, view Snowflake’s main announcement and our individual announcements on our price reduction, Snowflake On Demand and Snowflake’s deployment in Frankfurt.

As always, keep an eye on this blog site and our Snowflake-related Twitter feeds (@SnowflakeDB) for more interesting things about Snowflake, and for updates on all the latest action and activities here at Snowflake Computing.