Why 2024 is the time to rewrite your engineering playbook

Advancements in AI consumed our attention and drove major business considerations in 2023. Seemingly overnight, Generative AI (GAI) went mainstream – quickly becoming more deeply embedded across organizations and in everyone’s day-to-day work. Executives recognize the potential value GAI can bring to their organizations, with 74% seeing at least one way it will benefit their employees, according to our September 2023 U.S. Executive Confidence Index.

For engineering leaders, the focus in 2024 will be on how best to harness the potential of this technology and navigate its associated challenges – from rewriting their playbooks and establishing the right technical building blocks, to identifying the talent needed to fully leverage AI in their products and processes. Here are three ideas to prioritize this coming year.

Big Idea #1: Reevaluate what elements of your tech stack need to be owned.

As engineering organizations start to manage more AI models, how they think about their tech stack will need to change. Smaller companies establishing basic infrastructure will have a different approach than large companies already utilizing hundreds or thousands of AI models across different processes and products.

“As you mature on your AI journey, you’ll have to consider what to build and buy. Use ROI to guide your decisions. How much will you need to invest to build it? Do you have the right talent in-house to do it? Do you have the compute resources to train and serve the models?” – Ya Xu, VP of Engineering and Head of Data and AI, LinkedIn.

Think about this in terms of the different layers of the tech stack. Foundation models, such as large language models (LLMs), sit at the bottom layer – only a handful of companies have the resources to build at this level, so leveraging state-of-the-art LLMs that others have built usually makes sense. At the top of the stack is the application layer – the hardest to buy, because it is where the product directly interacts with users to help them accomplish their goals. In between sits a thick middle layer that involves orchestrating workflows (e.g., routing to the right LLMs for the right tasks), managing versions of prompts, responsible AI detection and mitigation, and many other functions. This middle layer tends to be deeply integrated with the rest of an organization’s development stack. While some components can be bought, others are not easy to buy and integrate as out-of-the-box solutions, though there are plenty of open source options to build on top of.
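
To make that middle layer more concrete, here is a minimal sketch of what orchestration can look like: routing a task to a model and rendering a versioned prompt template. The model names, task types, and the ROUTES table are illustrative assumptions rather than any particular vendor's API; a production layer would add fallbacks, responsible AI checks, and observability.

```python
# A minimal, illustrative "middle layer": route a task to a model and render a
# versioned prompt template. Model names, task types, and the ROUTES table are
# placeholder assumptions, not a specific vendor's API.
from dataclasses import dataclass


@dataclass
class PromptTemplate:
    version: str
    template: str


# Hypothetical registry: task type -> (model choice, versioned prompt template).
ROUTES = {
    "summarize": ("general-llm-small", PromptTemplate("v3", "Summarize the text:\n{text}")),
    "classify": ("general-llm-small", PromptTemplate("v1", "Label this text as spam or not spam:\n{text}")),
    "draft": ("general-llm-large", PromptTemplate("v7", "Write a first draft about:\n{text}")),
}


def route(task: str, text: str) -> dict:
    """Pick a model for the task and render its current prompt version."""
    model, prompt = ROUTES[task]
    return {
        "model": model,
        "prompt_version": prompt.version,
        "prompt": prompt.template.format(text=text),
    }


if __name__ == "__main__":
    request = route("summarize", "Quarterly infrastructure review notes ...")
    print(request["model"], request["prompt_version"])
```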

When considering the risks and benefits of building compared to buying tools for the tech stack, evaluate when and where working with third parties can provide more scale and leverage. Last but not least, when looking for outside capabilities, be diligent about vetting third parties from both a security and trust perspective. Find partners that adhere to similar security principles and policies.

Big Idea #2: Keep engineering teams a step ahead by investing in skills.

The skills engineering teams need to succeed are rapidly changing as AI advances. Over the past year, we’ve seen entirely new engineering skills and roles emerge, such as prompt engineering. The iterative nature of AI means that teams will need to continuously learn and evolve to keep up with innovations. Over time, AI engineering may not be a distinct discipline – it might become a necessary part of every software engineer’s skill set. For example, when we launched our AI-powered Premium experience, many of our engineers had to learn complex prompt engineering skills quickly. Reminiscent of traditional coding with Java or C++, designing a quality prompt can require careful structuring, creativity and a lot of testing. By helping our teams develop these new skills, we’re empowering our engineers, and our organization, to move quickly to leverage new opportunities.
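
As a toy illustration of treating prompt design like code – structure the prompt, then test its outputs – the sketch below pairs a hypothetical prompt template with a small regression-style check on the response format. It is not the actual Premium prompt or LinkedIn's tooling, just one way a team might make prompt quality testable.

```python
# A toy example of prompt engineering discipline: a structured prompt template
# plus a small regression-style check on the expected output format.
# The prompt text and the check are illustrative assumptions, not real product prompts.
import json

PROMPT_V2 = """You are a career-advice assistant.
Task: suggest exactly three skills for the profile below.
Profile: {profile}
Respond with a JSON list of three strings and nothing else."""


def check_response(raw: str) -> bool:
    """Verify the model's reply is a JSON list of exactly three strings."""
    try:
        skills = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return isinstance(skills, list) and len(skills) == 3 and all(isinstance(s, str) for s in skills)


if __name__ == "__main__":
    print(PROMPT_V2.format(profile="Backend engineer, 4 years of Java"))
    # Canned replies stand in for real model calls during a test run.
    assert check_response('["SQL", "Python", "Communication"]')
    assert not check_response("Here are three skills: SQL, Python, Communication")
```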

Developing the right technical skills to build with AI will be just as important as helping teams develop the soft skills they’ll need to succeed. Our research shows that as much as 96% of software engineers’ skills – like coding and programming – may be augmented by GAI, which puts greater emphasis on the people-oriented aspects of their job. Tech professionals who have developed soft skills – like communication, teamwork, problem-solving, or leadership – in addition to hard skills get promoted more than 13% faster than employees who have hard skills alone. Helping teams adapt to the changing environment will be as much a cognitive transformation as a skills transformation.

“It’s an exciting time to be a leader. AI is presenting itself as a new type of disruption. AI isn’t an episodic wave of change – like mobile or the Internet; it’s creating change on a more continuous curve,” said Aarathi Vidyasagar, VP Engineering, Talent Solutions, LinkedIn. “As a leader, it's critical that you look at the evolving industry and ask 'What skills does my team need to thrive, not just survive?'”

Big Idea #3: Scale compute infrastructure strategically.

As AI continues to advance, there will likely be significant changes to the digital products and services we engage with, both as individuals and within our engineering teams. This may include integrating AI into existing products or creating entirely new products reimagined around the capabilities of AI. The compute capacity organizations will need to support this shift is expected to grow exponentially in the coming years.

Take a look at the TOP500 list, which ranks the fastest supercomputers in the world; it illustrates the scope of the power AI will require. This year, Microsoft’s Eagle was one of the top three supercomputers, leaping ahead of traditional front-runners as Microsoft quickly raised its computing capacity to support the rapid growth of its AI applications.

When developing a plan for compute resource requirements beyond 2024, it is crucial to assess how capacity and infrastructure architectures must evolve. Organizations will need to take stock of what is possible with their current hardware and infrastructure, and then determine whether multi-layered architectures are best suited to their needs or whether the newest hardware delivers enough compute capability to allow for simpler approaches. Scaling compute resources to support diverse computing needs and the increasing use of AI in work processes may involve a combination of cloud-based and on-premises solutions. Leaders need to take the time to test, analyze, and adjust infrastructure based on what is required to deploy the AI-first experiences that will shape the future.

“The current common ways of building AI products, especially the search and recommendation stacks, will undergo a dramatic shift enabled by the compute capabilities and architectures being developed today. As AI becomes more prevalent, it will help us build new products supporting the myriad of ways we work, and the everyday tools we use to further our careers, interests and businesses.” – Souvik Ghosh, Distinguished Engineer, AI, LinkedIn.

Innovate responsibly 

As organizations venture into new and more challenging applications for AI, there is inherent uncertainty about how quickly to adopt changes and make investments that push the possibilities of the technology. AI is constantly evolving: its creative applications keep expanding, and the need to further test and evaluate its capabilities remains top of mind for engineering teams.

Regardless of the pace at which an organization chooses to explore or implement AI, it’s essential to keep people at the center of those decisions. Consider establishing guardrails and guidelines that help teams build with AI responsibly – and ensure there is human oversight in assessing the risks and potential for harm. Now is the time to make key decisions about putting the right principles, infrastructure and skills in place to innovate with AI in 2024 and beyond.
