This must be satire, surely?
On LinkedIn, 'thought leaders' are continually reinventing themselves as experts in The Next Big Thing, prognosticating about how The Next Big Thing should be applied to whatever random things they can think of. Jeff Sutherland seems to think a mashup of 'AI' and Scrum development would be a good idea. I couldn't tell whether his post was satire, or whether it was inspired by some mind-altering substances.
If a Scrum team is ten times faster than a dysfunctional team and AI makes teams go four times faster, then when both use AI the Scrum team will still be 10 times faster than the dysfunctional AI team. Actually, what I am teaching today is not only for developers to use AI to generate 80% of the code and be five times faster. [...] But if you make the AI a developer on the team you will get a synergistic effect, potentially making the team 25 times faster. I am supervising a Ph.D. thesis right now that is trying to prove this.
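Taking the quoted multipliers at face value, the arithmetic behind the claim looks like this (a sketch only; the figures are Sutherland's claims, not measurements, and the 25x 'synergy' number is asserted without derivation):

```python
# Multipliers exactly as claimed in the quote, taken at face value.
scrum_over_dysfunctional = 10   # Scrum team vs dysfunctional team
ai_speedup = 4                  # claimed speedup from adopting AI

# If both teams adopt AI, each is multiplied by the same factor,
# so the relative ratio is unchanged:
scrum_with_ai = scrum_over_dysfunctional * ai_speedup   # 40
dysfunctional_with_ai = 1 * ai_speedup                  # 4
print(scrum_with_ai / dysfunctional_with_ai)            # 10.0 -- still 10x

# The quoted 25x "synergistic" figure is larger than the 4x AI
# multiplier alone; no derivation for the jump is given in the post.
claimed_synergy = 25
print(claimed_synergy > ai_speedup)                     # True
```

The first result is just the observation that multiplying both sides of a ratio by the same constant leaves the ratio intact; the 25x figure does not follow from the stated numbers.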
As someone with a decade of actual software engineering experience, I'm confident in saying that what Sutherland described is exactly what the industry doesn't want. The 'synergistic' effect would actually result in substandard software, with potentially unmanageable technical debt, delivered far beyond any reasonable deadline.
Over that decade, I've seen software delivered months, and sometimes years, later than it otherwise would have been, because of the excessive overhead and disruption that Scrum entails. The most obvious reason is that meetings are highly disruptive to software development: a single meeting can wreck productivity for an entire morning or afternoon, which means a Scrum team might have roughly half the capacity of a team simply working off a Kanban board.
Another problem is that Scrum's unhealthy fixation on 'sprints', 'velocity', 'accountability' and daily status updates makes it an unsustainable method of working, and we end up with a high turnover of developers through burnout. This, in turn, degrades the quality of long-term support for whatever software is being produced.
What about 'AI'? I've posted about this before, after my colleagues and I experimented with it and discussed its possible uses as a tool. There are reasons I won't use ChatGPT, or some such, to generate my code. Or perhaps I would, for disposable bits of JavaScript. ChatGPT can generate roughly what a junior software developer would produce, and the results can be marginally better when we're specific about what we ask for. No software engineer, however, would use AI-generated code without review or heavy refactoring.