Q&A with James Newsom: Why Snowflake Cortex Is the Most Secure Way to Put AI to Work on Your Data

In this Q&A, we sit down with James Newsom, Solution Engineer at Snowflake, to unpack how Snowflake Cortex and Snowflake Intelligence are reshaping the way enterprises think about AI, data platforms, and self‑service analytics. From unstructured documents to semantic views and agentic AI, James explains why Snowflake is no longer just a data warehouse -- it is the place where AI comes to the data, not the other way around.

 

Q: You often talk about “Day Zero Frankenstein” data stacks. What do you mean by that, and how does Snowflake change the game?

James Newsom:

Most organizations start with a “Frankenstein” collection of tools. They have a data lake here, a BI tool there, some custom pipelines, maybe a separate vector store, and then a separate AI or ML platform. You are constantly cobbling pieces together and hoping they talk to each other. That is what I mean by “Day Zero Frankenstein.”

Snowflake is already “all of it.” It is your data warehouse, your data lake, your AI and ML platform, and now your AI‑enabled BI layer, all in one secure perimeter. Instead of moving data everywhere, AI comes to the data. The LLMs run inside the Snowflake service perimeter, your data never has to leave, and inference is governed by the same enterprise-grade security as the rest of Snowflake. That is the core shift. You are not building a patchwork. You are starting with a unified AI data platform.

 

Q: A lot of people still think “unstructured data is too hard” or “we can’t do that in a data platform.” Is that still true?

James Newsom:

That is an old way of thinking, and the game has changed with Snowflake. Using AI to parse text and documents at scale is remarkably straightforward now. You can take millions of documents, emails, PDFs, or chat logs and, with Cortex Search and Snowflake’s built-in embedding models, turn them into structured, queryable tables inside your warehouse.

The mindset shift is that your data warehouse is no longer just for cleaned, tabular data. You can now parse unstructured data into tables and treat it like any other dataset. That means you can build your entire warehouse using AI to ingest and structure data, instead of waiting for perfect schemas or perfect pipelines. The old roadblock of “we will deal with unstructured later” is gone, but the fundamental truth still holds: everything starts with acquiring and modeling the data. And with Cortex Search powering Cortex Agents, you can deliver an LLM-enabled "chat with your data" experience that works across all of your data formats -- structured and unstructured alike.
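As an illustrative sketch of that workflow (the stage, table, and warehouse names here are hypothetical, and the exact options may vary by account), parsing staged PDFs and exposing them through Cortex Search might look like:

```sql
-- Hypothetical stage and table names; adjust to your environment.
-- Extract text from staged documents with Cortex's document parsing function.
CREATE OR REPLACE TABLE parsed_docs AS
SELECT
  relative_path,
  SNOWFLAKE.CORTEX.PARSE_DOCUMENT(@doc_stage, relative_path,
                                  {'mode': 'LAYOUT'}):content::STRING AS content
FROM DIRECTORY(@doc_stage);

-- Make the parsed text searchable, so Cortex Agents can retrieve from it.
CREATE OR REPLACE CORTEX SEARCH SERVICE doc_search
  ON content
  WAREHOUSE = my_wh
  TARGET_LAG = '1 hour'
  AS (SELECT relative_path, content FROM parsed_docs);
```

From there, the parsed documents behave like any other governed table in the warehouse.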

 

Q: You talk about “AI/ML → Data Engineering → Data Lake” as a new perspective. How does that fit with open formats like Iceberg?

James Newsom:

There is a growing community that wants open, interoperable table formats. They want no vendor lock‑in and no architecture lock‑in. Snowflake participates in that through the Iceberg specification and other open standards. But the key point is that open formats must earn their place in your architecture.

If you want to export your data to Iceberg or Parquet, Snowflake makes that straightforward. There is no real lock-in with Snowflake native tables. That said, managing your own data files on open formats is harder than letting Snowflake handle the complexity for you. With Snowflake managed tables, you do not have to wrestle with storage infrastructure, file organization, and security at the file level. Snowflake's architecture makes it far simpler to manage at scale, while still giving you full interoperability when you need it. Snowflake is one of the leading commercial platforms for Apache Iceberg, with native Iceberg table support and the open-source Apache Polaris catalog, so when true multi-engine interoperability is required, we have you covered.

In fact, we just made this even simpler. Snowflake now offers Snowflake-managed storage for Iceberg tables, currently in public preview. This means you get the openness of the Apache Iceberg format -- your data is stored as standard Parquet files with Iceberg metadata -- but Snowflake handles the entire storage layer for you. No cloud storage buckets to provision, no IAM roles to configure, no file lifecycle to manage. You get a native Snowflake experience with all the platform features you expect -- time travel, replication, zero-copy cloning -- while your data remains in an open format that any Iceberg-compatible engine can read through the Horizon Catalog REST API. It is the best of both worlds: fully managed simplicity with full open-format interoperability.
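To make the point concrete, a Snowflake-managed Iceberg table is ordinary DDL. This is a hypothetical sketch (table and column names are invented, and depending on your account configuration an `EXTERNAL_VOLUME` may still be required when you manage your own storage):

```sql
-- CATALOG = 'SNOWFLAKE' makes Snowflake the Iceberg catalog for this table.
-- With Snowflake-managed storage (public preview), Snowflake also manages
-- the underlying Parquet files, so no external volume setup is needed.
CREATE ICEBERG TABLE sales_events (
  event_id NUMBER,
  event_ts TIMESTAMP_NTZ,
  payload  VARCHAR
)
  CATALOG = 'SNOWFLAKE';
```

Once created, it queries like any native table, while remaining readable to other Iceberg-compatible engines.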

 

Q: Snowflake Intelligence is a big part of your story. How does it actually work for a business user?

James Newsom:

Snowflake Intelligence is Snowflake's AI-powered experience for the enterprise. It is powered by Cortex Agents, which can use leading frontier models from Anthropic, OpenAI, Meta, Mistral, and Snowflake -- all hosted and running inside the Snowflake security perimeter. Your administrators control which models are available, and you can use whatever best fits your needs. The big idea is that you can ask questions over your enterprise data, whether documents or tables, in plain English, just like you would in a consumer AI chat, but with enterprise-grade security and governance built in.

Your data never leaves the Snowflake perimeter, and it is never used to train or fine-tune the foundation models. That is what makes this one of the most secure ways to use AI in the enterprise today. You can connect it to both structured and unstructured data, and it surfaces insights in natural language. This is AI-enabled BI. You are not just looking at dashboards. You are having a conversation with your data.
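A minimal sketch of what in-perimeter inference looks like in SQL (the table, column, and model choice here are illustrative; your administrators control which models are enabled):

```sql
-- The model runs inside the Snowflake perimeter; the table data never
-- leaves the platform and is not used to train the foundation model.
SELECT
  ticket_id,
  SNOWFLAKE.CORTEX.COMPLETE(
    'claude-3-5-sonnet',   -- any model enabled for your account
    CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
  ) AS summary
FROM support_tickets
LIMIT 5;
```

The same governance that applies to the `support_tickets` table -- roles, masking, row access policies -- applies to this query.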

 

Q: How does “semantic views” fit into this? What is the “secret sauce” there?

James Newsom:

Cortex Analyst runs on semantic views, which are first‑class objects in Snowflake that you create with DDL. A semantic view includes metadata about what the facts are, what the dimensions are, what the relationships are, and even business vocabulary and synonyms.

When you say, “I want to build a semantic view,” you are essentially teaching Snowflake what your data means. Cortex Analyst uses that semantic layer to translate natural language into accurate SQL. That is the secret sauce. The text-to-SQL process is grounded in your specific data model, your business definitions, and your verified queries -- not generic prompts.

You can add verified queries -- gold-standard SQL examples -- so that anyone can ask, “What were sales last month?” and get the same governed answer every time. It is not just self‑service BI. It is true self‑service analytics, where the semantic model is embedded as close to the data as possible.
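A condensed sketch of the DDL gives a feel for it (all table, column, and synonym names here are hypothetical, and this trims many available clauses):

```sql
-- Teach Snowflake what the data means: facts, dimensions, relationships,
-- business vocabulary, and governed metric definitions.
CREATE SEMANTIC VIEW sales_semantics
  TABLES (
    orders AS analytics.orders PRIMARY KEY (order_id),
    customers AS analytics.customers PRIMARY KEY (customer_id)
  )
  RELATIONSHIPS (
    orders (customer_id) REFERENCES customers
  )
  FACTS (
    orders.order_amount AS amount
  )
  DIMENSIONS (
    customers.region AS region WITH SYNONYMS ('territory', 'geo')
  )
  METRICS (
    orders.total_sales AS SUM(orders.order_amount)
      COMMENT = 'Governed definition of sales'
  );
```

Cortex Analyst then grounds its text-to-SQL in exactly these definitions, so “sales by territory” resolves to the governed metric and dimension rather than a guess.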

 

Q: You mention “everything starts with acquiring the data,” but then you also talk about AI‑enabling your data. How do those ideas connect?

James Newsom:

There is a simple truth: there is no AI strategy without a data strategy. You need quality data to fuel AI. The medallion architecture (bronze, silver, gold) is still relevant, but now you are doing it with AI in the loop.

You integrate, you build into the warehouse, and you curate gold‑quality data for key domains. Then you AI‑enable that data. You use semantic views, Cortex Analyst, Cortex Agents, and Snowflake Intelligence. The difference is that now you do not have to focus on the UI. The business user just asks a question, and the AI surfaces the insight. The data platform becomes the engine for both BI and AI.

 

Q: How does Snowflake compare to other vendors when it comes to cost, elasticity, and “over‑engineering”?

James Newsom:

A lot of other stacks force you into capacity units that are hard to pause, hard to scale, and hard to understand. You end up over‑engineering because you do not have real elasticity. With Snowflake, compute is provisioned on demand the instant a query runs and auto-suspends when queries complete. You can scale up and down programmatically, and you only pay for what you truly use.
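That elasticity is just warehouse configuration. As a sketch (the warehouse name and sizes are illustrative):

```sql
-- Compute suspends itself when idle, so billing stops between queries.
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60            -- suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE           -- wake automatically when a query arrives
  INITIALLY_SUSPENDED = TRUE;

-- Scale up programmatically for a heavy workload, then back down.
ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';
ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'XSMALL';
```

No capacity planning exercise, no reservation to amortize: the configuration above is the whole elasticity story.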

As the platform has matured, performance improvements mean you are getting more done per credit over time. Snowflake Intelligence is consumption-based too -- you are not paying a flat per-user subscription fee whether someone uses the service daily or never logs in. With many competing AI chat products, you pay the same for a power user as you do for someone who never touches it. The Snowflake philosophy is straightforward: one platform, consumption-based pricing, and every capability of the platform is available to you without separate licenses. You invest in one AI data platform that grows with you.

 

Q: What does “AI‑enabled BI” actually look like in practice?

James Newsom:

Traditional BI gives you static dashboards that answer a pre‑conceived set of questions. AI‑enabled BI lets you ask, “Why did sales drop last month?” and then follow up with, “What about in the Northeast region?” or “Which products drove that change?” That is the power of LLMs. They are awesome at the “why” and the dynamic follow‑up.

With Snowflake Intelligence, you are not just seeing charts and graphs. You are having a conversation with an expert analyst. You can get to the “why” in minutes instead of weeks of back‑and‑forth with the BI team. That accelerates organizational maturity. Once you can ask anything, you start thinking about what you want to make happen, not just what happened. It is worth pointing out that most customers have a BI team that is often best equipped to help define and test the semantics that provide the crucial context for AI accuracy.

 

Q: You have been talking about Cortex Code CLI and agentic AI. What is the impact there?

James Newsom:

Cortex Code CLI is Snowflake’s AI coding agent, purpose‑built for the Snowflake data stack. It is a massive productivity multiplier. The path from idea to proof-of-concept can happen in a single sitting, POC to production takes days instead of sprints, and time to business value compresses dramatically. You tell Cortex Code what you want to build, such as “Create a semantic view of these tables” or “Build an agent that gives insight into customer churn,” and it generates the code for you.

When you combine agentic AI with the broader Snowflake platform, the impact compounds. You can connect to machine learning workflows, and the cost to design, build, and run models drops significantly. Data science becomes more accessible -- you no longer need a specialized team with months of lead time to get value from ML. Any organization can experiment and develop viable models that show rapid business value using their current data engineering talent. Snowflake provides the model registry, feature store, and observability to support the full ML lifecycle, and Cortex Code accelerates every step of the build.

 

Q: What is the bottom line for organizations thinking about Snowflake Cortex and Snowflake Intelligence?

James Newsom:

Snowflake is one of the most versatile technology investments an organization can make because you control what you spend, you do not have to commit to a big upfront outlay, and you can use it as much or as little as you need. It is an AI data platform that is consumption‑based, secure, and built for interoperability.

If you already have Snowflake, you already have access to the full capabilities of the platform. We do not license individual components, and you only pay for what you use. With Cortex AI, Cortex Code, and Snowflake Intelligence, we are building an AI‑enabled data foundation designed to help you answer the questions that matter, when they matter, in a way that is governed, secure, and scalable. That is the future of data platforms.