
Anthropic’s Cat Wu says that, in the future, AI will anticipate your needs before you know what they are


With the tech industry singularly focused on AI models, Anthropic is having an exceptionally good year.

The company may soon pull ahead of its main competitor, as it looks to raise tens of billions of dollars in a funding round that would put its valuation at some $950 billion (OpenAI was valued at $854 billion in its March round), and business customers increasingly express a preference for Claude over ChatGPT. A recent report showed Anthropic has outpaced OpenAI among business customers, quadrupling its market share since May 2025.

Cat Wu, Anthropic’s head of product for Claude Code and Cowork, has been a key figure in that success. Since joining the company in August 2024, Wu has helped shepherd Claude through a critical phase, leveling it up from a purely informational chatbot to a coding tool and beyond. Wu, who oversees the development of new features, is frequently paired with Boris Cherny, a core member of Anthropic’s technical staff and the creator of Claude Code, leading the pair to be characterized as Anthropic’s “Batman and Robin.”

Wu sat down with me at last week’s second annual Code with Claude conference in San Francisco, where she discussed how she thinks about product strategy and how she hopes the experience of using Claude will change in the future.

This interview has been edited for length and clarity.

When you’re looking at product strategy, how much of it is reactive to your peers or your competitors? Do you think about that at all?

The main thing that we design for is staying on the exponential. Across our team, we instill in everyone the lesson that AI will just continue to get better. For us, we just need to stay at this frontier. We don’t think about competitors. If you do think about competitors, you end up perpetually two weeks, or a month, behind how fast you could execute, and so it’s normally not the best way to stay at the frontier.

Anthropic released at least six models last year and has already released almost as many this year. Do you expect this pace of development to continue?

Our hope is that it continues (laughing). I think the models are still improving at a very steady pace, and so we should be able to keep sharing those with our users. I think the deployments might look a bit different—like how we handled Glasswing, but as much as possible, we want this intelligence to benefit as many people as possible, and it has to be handled in a very safe way, which is why we handled Glasswing [in the way that we did].

[Glasswing is an initiative that Anthropic launched in April that invited a small consortium of partner organizations — including companies like Amazon, Apple, CrowdStrike, and Microsoft — to gain access to its new cybersecurity model, Mythos. Unlike many of Anthropic’s other AI models, Mythos is not being given a general public release. The company has claimed that it fears the model — which is designed to scan codebases for software vulnerabilities — is too powerful, and could be weaponized by bad actors.]

You said in a previous interview that the future of work is basically staff managing fleets of agents. It seems like that could eventually lead to a situation where the agents are better at the job, or know the job, better than the human.

I think it is extremely hard to manage agents if you can’t do the job yourself. I think the managers still need to be experts in their domain. It’s a new skill set that a lot of people are going to have to learn, but managing agents is actually very similar to being a manager of people, in the sense that you have to understand, like, why did the agent make this mistake? Did it misinterpret my instruction? Was my request under-specified? You have to have the ability to debug it.

It does seem like the long-term goal is to cut down on team size, though. Because if you have agents doing a job, then you don’t need an intern, right?

Ideally, I think the idea is that everyone can get a lot more done. I think that, for everyone’s job, there’s always this percentage of it that’s really tedious. For me, it’s responding to emails. I think everyone has this part of their life…So my hope is that they [the AI agents] actually do that, and then everyone has, like, all these cool things that they will want to build [in their spare time].

What are you guys most excited about in the next six months?

I think the next big thing is proactivity. Last year we were in this world of synchronous development. Right now, people are shifting to routines, so like automating, for example, responses to customer support tickets. And I think the next step is that Claude understands what you work on, and just sets up some of these automations for you.

When you purchase through links in our articles, we may earn a small commission. This doesn’t affect our editorial independence.


Notion just turned its workspace into a hub for AI agents


Productivity software maker Notion is stepping into the agentic era.

In a live-streamed product announcement on Wednesday, the company, best known for its collaborative note-taking app, introduced a new developer platform that extends the capabilities of its custom AI agents, connects with external agents, and allows teams to build automated multi-step workflows that can pull in data from any database.

By building an orchestration layer — a system that coordinates AI work across multiple tools and data sources — Notion is positioning itself as more than a note-taker with AI features and instead as a hub where people and agents can collaborate across tools and databases.

In February, Notion first launched its Custom Agents — AI teammates that handle repetitive tasks, like answering frequently asked questions, compiling status updates, and automating workflows. Since then, Notion customers have built over one million agents, the company says.

However, these agents had limitations. They couldn’t connect with external data or use custom logic. External agents that companies used also didn’t have a way to connect with the Notion workspace. Teams had to work around these problems by using third-party automation platforms or writing their own scripts that run on their own infrastructure.

“It’s true that, historically, Notion hasn’t been the most developer-focused platform,” said Ivan Zhao, Notion co-founder and CEO, during the livestream. “But things are changing.”


Now, Notion will allow teams to deploy their own custom code. With its new Workers, Notion’s cloud-based environment for running custom code, customers can write their logic and deploy it to a secure sandbox (an isolated environment that keeps the code from interfering with other systems). This allows teams to do things like sync their data into Notion, build custom tools, and trigger work with webhooks — which are automated signals that kick off actions when something happens in another app — without needing to rely on external infrastructure.
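To make the webhook pattern described above concrete, here is a minimal sketch of the kind of custom logic a Worker might run: an incoming signal from another app is parsed and routed to an action. The event names and payload shape here are illustrative assumptions, not Notion’s actual schema.

```python
import json

def handle_webhook(raw_body: str) -> str:
    """Parse a webhook body and return the action to trigger.

    The event types ("ticket.created", "deal.closed") and payload
    layout are hypothetical, chosen only to show the routing pattern.
    """
    event = json.loads(raw_body)
    kind = event.get("type")
    if kind == "ticket.created":
        # e.g. sync the new support ticket into a Notion database row
        return f"sync-ticket:{event['data']['id']}"
    if kind == "deal.closed":
        # e.g. post a status update for the team
        return f"post-update:{event['data']['id']}"
    # Unknown events are acknowledged but ignored.
    return "ignore"

body = json.dumps({"type": "ticket.created", "data": {"id": "T-42"}})
print(handle_webhook(body))  # sync-ticket:T-42
```

In practice this function would sit behind an HTTP endpoint inside the sandboxed Worker; the sketch only shows the decision logic, not the platform’s deployment mechanics.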

You don’t even have to write the code. The company points out that your preferred AI coding agent can do it for you.

The Workers will use the same credit system as Custom Agents, but Notion is making this free through August, so developers can experiment.

Syncing external data sources is also a part of the Notion Developer Platform. Powered by Workers, the database sync feature can pull in data from any database with an API. That means you could access data from places like Salesforce, Zendesk, Postgres, and others within your own Notion databases — and keep the data current.
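The core of such a sync is a transform step: reshaping a record from the external system into the property JSON that Notion databases expect. The sketch below assumes a hypothetical CRM-style record (the field names are made up); the property value shapes match Notion’s public API, but the sync plumbing itself is the platform’s, not shown here.

```python
def to_notion_properties(record: dict) -> dict:
    """Map a CRM-style record onto Notion database page properties.

    Input fields ("name", "status", "amount") are hypothetical; the
    output follows the title/select/number property value format used
    by Notion's public API.
    """
    return {
        "Name": {"title": [{"text": {"content": record["name"]}}]},
        "Status": {"select": {"name": record["status"]}},
        "Amount": {"number": record["amount"]},
    }

row = to_notion_properties(
    {"name": "Acme renewal", "status": "Open", "amount": 1200}
)
print(row["Status"]["select"]["name"])  # Open
```

A Worker would then send this payload to Notion to create or update the corresponding database row, repeating on a schedule or webhook trigger to keep the data current.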

Zhao noted that this means that Notion’s users can now “use your Notion database as a shared canvas to power both your workflows and your agents.”


Workers can also build agent tools with custom logic, for those times when connecting with a third-party via MCP — short for Model Context Protocol, an emerging standard that lets AI tools connect to external data and services — isn’t enough.

Another addition allows Notion’s users to chat directly with external AI agents they use, assign them work, and track their progress, as if they were one of Notion’s own custom agents. At launch, Notion says that Claude Code, Cursor, Codex, and Decagon are supported partner agents, but it plans to add more.

There’s an External Agent API, too, if teams want to connect their own internal agents with Notion, like those they’ve built specifically for their company’s needs.


Developers and agents interact with Notion’s new Developer Platform via the Notion CLI, a command-line tool for developers, available on the company’s Business and Enterprise Plans.

The Developer Platform represents a shift in strategy for Notion as it becomes more of a programmable platform than just an application, setting it up to compete with other workflow automation platforms. As businesses increasingly look to automate knowledge work and build internal AI systems, a platform that ties together agents, custom code, and live data in one place starts to look less like a productivity app and more like core infrastructure.

It also follows the broader trend among AI companies, which have been moving beyond the AI chatbot to offer agentic tools that can take actions across different software platforms.

“Any data, any tool, any agent — that’s the big picture for the Notion Developer Platform,” Zhao said.


Musk’s xAI is running nearly 50 gas turbines unchecked at its Mississippi data center


Elon Musk’s xAI is running nearly 50 natural gas turbines at its Mississippi data center, power plants that the state is currently not regulating thanks to a loophole.

The power plants are considered “mobile” by the state of Mississippi because they sit on flatbed trailers, allowing them to dodge air pollution regulations for one year. The NAACP, which has filed a lawsuit on behalf of residents in the area, says the unchecked emissions from the turbines are worsening air quality in an already polluted region. This week, it asked the court for an injunction against xAI.

At issue is the “mobile” nature of the turbines. The Southern Environmental Law Center, which filed the lawsuit on behalf of the NAACP, says the turbines are being operated in violation of federal law, which says that power plants mounted on a trailer can still be considered stationary and subject to air pollution regulations.

xAI has been granted permits for 15 of its turbines. A Greater Memphis Chamber of Commerce press release previously said that “about half” of the 35 turbines in operation in May 2025 would remain on site. However, xAI has continued to install more. Currently, it’s operating 46, according to a local news report.


TIOBE Index for May 2026: R Ascends as Statistical Tools Consolidate


The May 2026 TIOBE Index keeps Python at #1 as Java edges past C++. R climbs to #8, and TIOBE CEO Paul Jansen says statistical tools are consolidating around Python and R.

The post TIOBE Index for May 2026: R Ascends as Statistical Tools Consolidate appeared first on TechRepublic.
