
5 ways AI will change the software development life cycle

Monday, December 2, 2024, 10:00 AM, from InfoWorld
Considering the scaling history and trajectory of generative AI models (specifically large language models, or LLMs) specialized for coding, the software development life cycle (SDLC) is ripe for disruption. Not because we’re all going to be replaced by machines, but because so many areas of the SDLC are a natural fit for AI, and more specifically for LLMs.

In a recent whitepaper, we at Crowdbotics explored what a ground-up revamp of the SDLC would look like with AI designed in from the start. To do this, we made three bold assumptions about the future:

Code generation would become near instantaneous and free.

Natural language would evolve into the primary programming interface.

Humans would act as verifiers rather than doers.

We were also keenly interested in understanding the tangential impact of AI on the SDLC beyond just hands-on keyboards. To do that, we looked at five key aspects that surround, are controlled by, and will ultimately be influenced by the future SDLC:

Speed: The two-week sprint is dead.

Teams: Humans shift from creation to verification.

Intelligence: Knowledge capture and access become automated.

Resources: Global development teams form a relay.

Consumption: Demand for software surges.

Looking at the broader impact of AI in the SDLC beyond the hype of “coding is dead” is crucial because it opens so many new doors of possibility, exploration, disruption, and, ultimately, opportunity.

Here we’ll briefly discuss the five properties of the SDLC that will change with AI.

Speed: The two-week sprint is dead

The current two-week sprint model, popularized by scrum in the early 2000s, has been an industry standard for balancing productivity and adaptability in software development. Two weeks essentially became the handshake agreement between engineering and product because it was considered enough time to build something worth having an alignment meeting about. So what happens if you no longer need two weeks but one? Or not even weeks but only days? Or even hours?

We glimpse this today, as AI coding assistants can increase developer speed by 50% or more while streamlining non-coding tasks such as reporting and project management. Some estimates suggest that AI could replace up to 80% of project management activities by 2030.

Given these improvements, the traditional two-week sprint will become too long. As development time is reduced and reporting tasks become expedited, we can expect a shift towards shorter, more dynamic sprint cycles.
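As a rough sketch of the arithmetic (the coding share and speedup figures below are illustrative assumptions, not measurements), even the 50% coding speedup on its own compresses a two-week sprint by days:

# Rough, illustrative arithmetic only; every number here is an assumption.
SPRINT_DAYS = 10       # working days in a two-week sprint
CODING_SHARE = 0.6     # assumed fraction of sprint time spent coding
SPEEDUP = 1.5          # "50% faster" coding with an AI assistant

coding_days = SPRINT_DAYS * CODING_SHARE / SPEEDUP   # 4.0 days instead of 6
other_days = SPRINT_DAYS * (1 - CODING_SHARE)        # unchanged non-coding work
print(f"Same scope now fits in about {coding_days + other_days:.0f} working days")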

Teams: Humans become verifiers, not creators

The role of human software engineers will shift dramatically as AI models generate code quickly and cost-effectively. The best summary I’ve heard is that software engineering (programming, specifically) will become what electrical engineering is today: still highly needed, but much more niche. That’s not going to happen today or even tomorrow. The code generation we have today is useful, but it only performs basic tasks. We have yet to see full code generation of a large-scale enterprise application. This isn’t a knock against current technology. Every innovation starts this way, and we should look at it as tea leaves rather than an end product.

Reading these tea leaves, it is reasonable to believe that these new team structures will consist of multiple AI agents, each with a specific role in the software development process. For example, one agent might define the project scope and objectives, while another focuses on project planning and quality analysis. Human engineers will oversee this process, providing input and verifying the AI-generated results.
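A minimal sketch of what that structure could look like in code, assuming nothing about any particular agent framework: the roles, the call_llm placeholder, and the human approval gate below are hypothetical illustrations, not a reference implementation.

from dataclasses import dataclass

@dataclass
class Artifact:
    """A work product passed between agents: scope doc, plan, code, QA report."""
    kind: str
    content: str
    approved: bool = False

def call_llm(role: str, prompt: str) -> str:
    """Placeholder for an LLM call; swap in whatever model or provider you use."""
    return f"[{role} output for: {prompt[:50]}...]"

def scope_agent(request: str) -> Artifact:
    return Artifact("scope", call_llm("scoping", f"Define scope and objectives for: {request}"))

def planning_agent(scope: Artifact) -> Artifact:
    return Artifact("plan", call_llm("planning", f"Create a plan and QA criteria for: {scope.content}"))

def human_verify(artifact: Artifact) -> Artifact:
    """The human-in-the-loop step: an engineer reviews before work proceeds."""
    print(f"--- Review {artifact.kind} ---\n{artifact.content}")
    artifact.approved = input("Approve? [y/N] ").strip().lower() == "y"
    return artifact

if __name__ == "__main__":
    scope = human_verify(scope_agent("customer-facing billing portal"))
    if scope.approved:
        plan = human_verify(planning_agent(scope))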

Intelligence: Knowledge capture and access become automated

Jira, Slack, Confluence, Workday, Dynamics, Teams, Docs… This is knowledge management, also known as the bane of any developer’s existence. Capturing, storing, and making available the wealth of content created during the software development process is daunting, time-consuming, budget-consuming, and often done very poorly. Because most of this information is captured and stored as text, it’s a ripe area for LLMs to step in and help automate and clean up the process.

Knowledge management basically consists of two functions: knowledge capture, i.e., how you effectively and efficiently capture knowledge, and knowledge access, i.e., how you not only offer access to knowledge but also make sure people actually use it. While both capture and access are interesting and important, I find the possibilities for access most promising because, with generative AI, this data and context can be proactively applied rather than merely made available for lookup. As you capture information in an LLM, you can extend that context model to other applications.
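A sketch of what “proactively applied” could mean in practice: captured project knowledge is pulled into an LLM prompt automatically when a developer picks up a task, rather than waiting for someone to search a wiki. The sample knowledge base and keyword-overlap scoring below are stand-ins purely for illustration; a real system would use embeddings and a proper store.

# Illustrative only: proactive knowledge access over captured project text.
# A real system would use embeddings and a vector store; naive keyword overlap
# keeps this sketch dependency-free.
KNOWLEDGE_BASE = [
    "Decision 2024-03: billing service retries use exponential backoff, max 5 attempts.",
    "Incident 112: payment-gateway timeouts have a documented staging workaround.",
    "Sprint 41 retro: flaky integration tests traced to the shared staging database.",
]

def relevant_context(task: str, top_k: int = 2) -> list[str]:
    task_words = set(task.lower().split())
    return sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(task_words & set(doc.lower().split())),
        reverse=True,
    )[:top_k]

def build_prompt(task: str) -> str:
    context = "\n".join(relevant_context(task))
    return f"Project context:\n{context}\n\nTask:\n{task}"

print(build_prompt("Fix retries in the billing service payment flow"))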

A current example we’re excited about is the ability to share GitHub Copilot context models between developers as they hand off work. Imagine picking up code someone just finished and also picking up all of the context they generated while writing it, allowing the next GitHub Copilot session to pick up where a previous one left off. As with the previous section, this tea leaf offers a glimpse of what the future could hold.

Resources: Global development teams form a relay

We’ve been running global software teams for years, but this globalization was driven mainly by access to lower-cost resources, not by a strategy to maximize efficiency. Sure, you may have a Team B working on the other side of the globe that can wake up and view a status report of what’s been done by Team A, but the work usually stops until Team A is back online. Cool, but not game-changing. What’s nice is that we live on a sphere that turns, and if you slice it into three parts, it’s always daytime for someone.

Follow-the-sun delivery is the epitome of software development efficiency: global teams sequentially and indefinitely hand off work to the next team as the world turns. It was tried years ago by IBM and others, but with no great success.

Why? There are a multitude of reasons, mainly context transfer and coordination costs. If one team is blocked due to a poor handoff, the whole process stops until the previous team is back online, effectively blocking two teams in the interim. As we outlined earlier, capturing context is a very good fit for AI, and using AI to update and share this context consistently should dramatically lower the probability of a team stalling after a handoff.
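One way to picture an AI-maintained handoff is as a structured context packet that the outgoing team’s tooling assembles (with an LLM summarizing the shift’s work) and the incoming team loads before anyone starts. The schema below is a hypothetical sketch, not any existing tool’s format.

import json
from dataclasses import dataclass, asdict

@dataclass
class HandoffPacket:
    """Hypothetical context packet passed between follow-the-sun teams."""
    summary: str               # LLM-generated summary of what was done and why
    open_questions: list[str]  # decisions the next team must make or confirm
    blockers: list[str]        # anything that could stall the next shift
    touched_files: list[str]   # where the work happened

def write_handoff(packet: HandoffPacket, path: str = "handoff.json") -> None:
    with open(path, "w") as f:
        json.dump(asdict(packet), f, indent=2)

def read_handoff(path: str = "handoff.json") -> HandoffPacket:
    with open(path) as f:
        return HandoffPacket(**json.load(f))

packet = HandoffPacket(
    summary="Refactored invoice export; retry logic still pending review.",
    open_questions=["Keep CSV export or switch to parquet?"],
    blockers=[],
    touched_files=["billing/export.py", "billing/tests/test_export.py"],
)
write_handoff(packet)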

What does this all mean? Development resources can truly be global, both in physical location and in time.

Consumption: Demand for software surges

As AI reduces the cost and effort it takes to develop software, it may seem intuitive to expect a decrease in overall development activity. The opposite is likely to occur, however, thanks to a phenomenon called the Jevons paradox.

Originally based on an economic study of the demand for coal in Stanley Jevons’ book The Coal Question, the Jevons paradox says that demand for a resource increases when new technology makes use of that resource more efficient. Why is this a paradox? Time and time again, businesses, governments, and other organizations have expected efficiency to lower underlying commodity consumption rates when, in reality, they go up. Energy is a great example: We consistently expect that becoming more energy efficient will lower energy consumption, but quite the opposite is true.
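A toy calculation makes the mechanism concrete (every number below is an illustrative assumption, not data): if efficiency gains cut the cost per project by 80% and demand responds strongly enough, total development activity and even total spend rise.

# Toy Jevons-paradox arithmetic; all figures are illustrative assumptions.
baseline_projects = 100   # projects an organization funds per year today
baseline_cost = 1.0       # normalized cost per project

new_cost = 0.2            # cost per project after an 80% efficiency gain
new_projects = 900        # assumed demand response: far more projects become viable

print("Baseline spend:", baseline_projects * baseline_cost)  # 100.0
print("New spend:", new_projects * new_cost)                  # 180.0, higher despite cheaper projects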

This paradox suggests that as software development becomes more efficient and accessible, demand for new software applications will surge. At least four factors contribute to this increased demand:

Increased development speed: Faster software creation enables more projects to be undertaken.

Lower costs: Lower development costs make software solutions viable for a broader range of applications and organizations.

Expanded domains: AI enables software to penetrate new areas previously considered impractical or too complex.

A larger talent pool: AI tools will make software development accessible to a broader range of individuals, further driving innovation and creation.

This point about consumption encapsulates the previous four, which is to say that all signals point toward exponential growth in demand for software. AI will play a central role in both stimulating and meeting that demand. But it’s not enough to say AI will play a central role and then sprinkle it in where we can; it’s critical to fundamentally rethink how we build software with AI at the core of every process, tool, and methodology—throughout the SDLC.

A vision to be tested

The whitepaper was a theoretical position piece and needs to be tested. Still, we are already beginning to see hints of a new SDLC coming to light. With new AI dev tools coming online daily and the incredible ability of insurgents to challenge (and beat) large incumbents, change is coming.

As we mentioned before, the factors outlined here and in the whitepaper are tea leaves. We’re interested in hearing what other developers think. We encourage you to reach out, and to check out some of our other research topics, at crowdboticsresearch.com.

Cory Hymel is vice president of research and innovation at Crowdbotics.



New Tech Forum provides a venue for technology leaders—including vendors and other outside contributors—to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to doug_dineley@foundryco.com.
https://www.infoworld.com/article/3609988/5-ways-ai-will-change-the-software-development-life-cycle....
