One of our current obsessions is how the introduction of generative artificial intelligence alters the nature of jobs, organizational structure, and management, sometimes in ways most people haven’t anticipated.

Microsoft’s Jared Spataro has an interesting perspective on this, as his role heading the tech company’s Modern Work & Business Applications team affords him advance access to many of the AI tools that businesses will roll out to employees in the months and years ahead. When we spoke with him earlier this year, he predicted that as a result of AI and other factors “the manager of the next even two or three years is going to look very different.”

Spataro will be a keynote speaker at the Charter Workplace Summit on Oct. 26—you can see more details and sign up to attend for free here. We caught up with him at Microsoft’s recent unveiling of new AI features in products including Windows and Microsoft 365, and are sharing that discussion here, edited for clarity, as a preview of what we’ll cover live in a few weeks at the Summit.

When we spoke earlier this year, you mentioned two ways in which the introduction of AI impacts management and leadership: One is managing across time and space, including flexible, hybrid working. The second thing was managing teams for machine-human augmentation as an additional skill required of managers. Building on that, are there other things that come to mind?

I'll dive a little bit into what we talked about last time. We're seeing companies really engage in projects that are helping me see the future a little more across industries right now. What people are starting to understand is that what we have here is a general-purpose reasoning engine. We've never had anything like that, except for the human brain, up to this point. The underlying assumption of firm design and team design has been: anytime you need a thinker, stick a person there. What I see as a fundamental new skill is thinking, no, you don't have to stick people there. You can sometimes stick machines there. That's causing me to rethink everything from the structural design of teams and organizations to process redesign to even what the culture looks like. People could feel very threatened if you're like, 'Well, people used to do that and now they don't. And how do I know I'm not next?' Those issues seem to be surfacing more than anything else I see right now. It creates a hotbed of innovation when it comes to how you think about an organization, but it creates a lot of tensions too, for leaders and managers to work through. They're figuring it out.

You just mentioned three things: organizational structure, process, and culture. Let’s start with structure. There's research suggesting that the introduction of AI flattens organizations, so there are fewer middle managers. And we know that some of the tools are able to upskill less experienced workers...

It's a trend. I don't know if we've seen enough for any of us to say that definitively, but if you think about what middle managers do structurally, their job is essentially to be a buffer, a mediator between the people below them who are meant to do the work and the people above them who are setting expectations and trying to give direction. Certainly AI is not at the point where it can do all that work; there's a lot of mediation to be done. But the flow of information up and down, that type of work can be done much more effectively by machines. Some of the synthesis work that happens, summarizing what's happening, reasoning across and looking at options—that can be done. What I see happening is people reevaluating: Is the way we've organized ourselves ideal, or is it based on previous assumptions? And can we experiment? Just like you're talking about with a flatter organization, where middle management that was meant as this combination of communication and control is not as needed, or can play different roles. It's a very hot topic right now, and in motion. But there's nothing definitive enough that I'd say it has clearly emerged yet.

Thomson Reuters surveyed lawyers and tax specialists about how they anticipated work would change. The majority of them said they believed lower-skilled workers would do more work that traditionally required expertise. But they thought there would be fewer of the lower-skilled workers and they themselves felt like their expertise would be more greatly valued. How do you make sense of that?

I have a little bit of a contrarian view here right now, based on my work. There's a sense that lower-skilled jobs will be impacted in major ways and that the people sitting up high on their perches will not be. I don't think that's right. What we're seeing is that there's a new skillset required to work with these machines. Almost anyone can learn it, because it's no longer programming; it's more domain knowledge and how to work within a particular domain. Lots of people can learn that. So you could, if you wanted to be provocative, say that it has the potential to change the balance of power within an industry or an organization. I don't think it's going to play out the way people have imagined it in their heads, having not touched the technology.

The Thomson Reuters survey participants are lawyers and tax specialists who are not necessarily using the technology. And, to your point, McKinsey research from this summer concluded that people with the highest education levels have much greater exposure to automation since the arrival of generative AI.

I can absolutely understand that.

How do you think about skills in this context? How can organizations prepare workers for what you're saying could happen?

I don't have a perfect model worked out in my head, but I'd say this: economically, what gets valued tends to be the scarce resources. When expertise was a scarce resource, when thinking was a scarce resource, when solid reasoning was a scarce resource—which is what the field of economics has thrust upon us over the last 50 years, with specialization à la Adam Smith but in spades—then we tended to create a worldview that was all about skills, about getting those really scarce resources and monetizing them.

I think we're going to be seeing a different world. There still will be scarce resources for sure, but will reasoning over complex data sets be one of them? It's not clear to me that that will be a scarce resource going forward. We'll economically get to the point where that is cheaper and cheaper and cheaper.

So now to get to your question, what skills are valuable? What I'm seeing, in some ways, is a very interesting combination of generalized management skills, where everyone becomes a generalized manager, meaning these generative AI tools essentially put an entire staff at the beck and call of anyone. That's a new skillset: how you generally manage, how you delegate, how you pass judgment on what is brought back to you, how you synthesize across things. That's a skillset for sure. It's more general.

How you prioritize....

How you prioritize, all that type of stuff. Then there are some domain-specific—I don't know if I'd call them skills—but knowledge and understanding, enough to use the tools to get real results. What I've found is that the more I know about a particular issue, topic, or problem set, the better the results are from generative AI. I can ask very sophisticated questions. I can direct it in different ways, for example: 'We've gone down this path, don't do that. I want you to go deep here. I want you to make sure that you bring these things to the forefront.' That requires domain knowledge. But that combination is almost at odds with itself, and that's really interesting: a generalized skillset, but the ability to have depth in key areas where you have to make decisions.

One example that we've been thinking about is Code Interpreter: if you have a statistics background, you can actually use it much more effectively, even though it makes statistics accessible....

That totally makes sense to me.

You talked about two other areas where management is changing as a result of AI. One is process and one is culture. How is process changing?

That one is definitely in motion, in a very exciting way. Right now, the cutting-edge business leaders out there are picking core workflows, core processes. They're asking themselves: What portions do humans do? What portions do machines do? Where do I want my humans spending their time? Where do I want machines spending their time? They are already doing business-process engineering, mostly pilots. Sales processes, support processes, operations-oriented processes, finance-oriented processes—there's so much that happens. If a firm were a living organism, there's so much that happens to keep the organism going. Finance has reporting cycles; salespeople have got to bring it in. There's lots and lots of work being done by function right now on those key processes. Forecasting processes, close processes....

So the work is to actually expose your process, who's involved with each step...

In some ways it feels like 1980s, 1990s process reengineering. But this time you have a really new tool set. How could I use that to close my books every quarter, for example?

What about culture? How is leading an organization's culture different with AI?

There are a couple of things on my mind. This is still forming right now too. The speed of AI is having a major impact; it's starting to change culture across industries. One example comes from our own company, where we just announced that Nov. 1 is the date that Microsoft 365 Copilot will become generally available. That's less than a year from the release of ChatGPT. We used to develop products on five- and seven-year cycles. It took us thousands of man-years to do this stuff. Speed is having a major impact on our culture because we're learning. As a simple example for us, you've got to think about how you move quickly without all the information, knowing that you have resources that can help you along the way in ways that you haven't been helped before. That's a change for many industries: thinking about what that means culturally. There will be some other cultural things. The one that certainly rears its head is the culture associated with labor markets and labor organizations. What does this mean for them? That's particularly interesting from my perspective.

What's the question there?

Organized labor: is it on the side of AI or not? Is it against AI entirely, or does organized labor believe that AI is a tool that needs to be democratized for every worker, and so shouldn't be applied unevenly? Does it change the balance of power between organized labor and others? There are some pretty important industry and social things happening around culture there.

We're seeing that in Hollywood...

Exactly. You see it right now in auto workers and Hollywood. It's definitely an issue.

I just read an interview with DeepMind co-founder Mustafa Suleyman where he was talking about how there are three stages of AI. The first was recognition, where you show AI lots of photos of cats and it learns to identify a cat. The second was generative. The third is interactivity, where you have agents that go out and do stuff relatively autonomously and then come back to you. What do you think of that?

It's interesting. Here's what I think of immediately. Most people don't understand that the heart of generative AI is really a thinking machine. They think of it as producing an image or producing text or a smart answer, but it doesn't do any of those things without actually being able to think about what's happening. So that blurs the line a little bit between the second and third phases. But nonetheless, taking it as a framework, what we introduced today was probably the industry's biggest single step toward having your copilot—your agent, if you will—that is meant to be a companion in the overall sense. In my domain, we're going to call it an assistant.

You bet. That seems like exactly where we're headed. We're taking a copilot approach to it, meaning it's not an agent that will go do a bunch of stuff on your behalf and then say, 'Hey, I'm just informing you, here's what I did.' That goes back to some cultural questions of what can an agent do. What's it empowered to do without you explicitly giving consent? There's a lot to work out there for sure, but our perspective is right now, this will be an assistant that is meant to help you with the things that are most important to you as opposed to doing a bunch of inter-agent talks.

What's the distinction there?

That a human must take explicit action. For us, our copilot doesn't send emails for us. Our copilot doesn't create documents for us. Our copilot requires a human action to send them out into the world.

Companies are looking to get their leaders up to speed on generative AI and business strategy—how would you recommend they do that?

I still would put at number one on my list that people just need to use the thing. Your personal experience is going to be more valuable than anything anybody has to tell you. Maybe number two on our collective list as a team—going back to your questions about leadership and management—is that we have recently been digging into some good work that's been done over the last 10 years out of Harvard on adaptive leadership.

Basically the framework is this: there are technical problems and there are adaptive problems. Technical problems are, as you would guess, those that have been solved before, where it's a matter of applying technique. Adaptive problems are those where you're walking into the dark; you don't really know what's next, and you've got to be able to experiment quickly and move into these areas. For me, it seems like a great framework for the management and leadership questions you were asking me earlier, especially when fused with your own experience. It's been sitting on the shelf for 10 years and, from my perspective, hasn't really seemed to take off. But I just think it might be a framework that has finally met its time.

Sign up to see Spataro live at the Oct. 26 Charter Workplace Summit here, with both virtual and in-person options.