The Amazon Web Services (AWS) Summit in London on 24 April 2024 was all about scaling Generative AI solutions. Here are two questions to ask yourself before you try.
SwissCognitive Guest Blogger: Chris Sherrington – “AWS Summit 2024 – Scaling your AI Executive Toys”
This year’s Amazon Web Services (AWS) Summit in London was an excellent opportunity to catch up on the progress in Generative AI (Gen AI). The resounding message was clear: if 2023 was about Gen AI proofs of concept (PoCs), then 2024 is all about scaling up.
Before embarking on the journey of scaling any Gen AI PoC, you need to correctly answer two key questions. First, do we have the right approach to data and cloud infrastructure, the foundations for our AI initiatives? And second, are we actively ensuring that we scale something that truly adds value? Let’s take a look at each in turn.
Cloud and Data Capability
Public Cloud is the obvious place to turn the talking black box that wowed your CEO into something that could do things for a customer. Hence it was no surprise the Summit focused on demonstrating the power of Gen AI in AWS’s ever-expanding toolbox.
Amazon Bedrock does a great job of mounting several foundation models (FMs) in a way that makes them consumable and modular. These digital brains can be transformed into AI agents capable of choosing options and taking action. Native serverless functions, message queues, identity providers, and API gateways give these brains the limbs and muscles to do real work.
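To make the "consumable and modular" point concrete, here is a minimal sketch of calling a Bedrock-hosted FM through boto3's Converse API. The model ID, region, and function names are my own illustrative choices, not anything shown at the Summit; treat this as a starting point, not a production pattern.

```python
# A minimal sketch: one prompt in, one answer out, via Amazon Bedrock's
# Converse API. Model ID and region below are illustrative assumptions.

MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # any FM you have access to


def build_messages(user_prompt: str) -> list:
    """Shape a single-turn conversation in the Converse API's message format."""
    return [{"role": "user", "content": [{"text": user_prompt}]}]


def ask_model(prompt: str, region: str = "eu-west-2") -> str:
    """Send one prompt to a Bedrock-hosted FM and return its reply text."""
    import boto3  # imported here so build_messages stays usable without the SDK

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(
        modelId=MODEL_ID,
        messages=build_messages(prompt),
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]


# Usage (requires AWS credentials and Bedrock model access in your account):
# print(ask_model("Summarise our returns policy in two sentences."))
```

The interesting part is how little of this is the model itself: everything around the `converse` call (queues, identity, gateways) is where the real product engineering lives.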
But things could go awry if your organisation still perceives software as a capital asset. Gen AI is more like a perishable good, subject to evolution and demanding constant adaptation.
Anything you build will be outdated in a few cycles – right now that’s months. Plus, everything in the cloud rots faster to begin with. If your organisation has never built and managed a digital product, you will struggle with the pace of Gen AI – even to just maintain the status quo.
To get Gen AI right, you need to invest not only in building a product but also a growing capability of people, skills, and tribal knowledge. Think of that investment as the cost of a gym membership. Embrace it as a lifestyle change. And don’t forget the basics of cloud engineering – that’s as big a mistake as skipping leg day at the gym.
Your Gen AI will obviously be powered by your data, but your data may not be ready. I’m not talking about sophisticated and neatly labelled training data. I mean fundamental business data: lists of customers, product catalogues, employee credentials, and so on. Clean, high-quality sources held on suitably performant platforms. Do not underestimate how important that is: the point was made and made again in every use case presented at the Summit.
Useful AI products require useful data products (or as close as you can get with your current architecture). If your data strategy is as basic as a reporting strategy on top of an application strategy, you will struggle to feed your AI capabilities.
Your data supply chain is the bottleneck that will limit your scaling. Try to first scale in an area with teams that care about providing good data directly from their functions – and help them do it better still. But there is a caveat.
The trap of scaling low value
The path of least resistance often leads to the place of lowest value. Many Gen AI PoCs were done at the margins of their enterprises due to safety concerns and the organisational inertia that any intrapreneur knows too well. There is a real risk of scaling something that is not worth it.
For example, I see a lot of HR-related use cases: chatbots that help employees understand policies or book time off. Using your employees avoids the discomfort of subjecting your customers to the new technology.
But that is also exactly what every HR system SaaS provider is working on. Unless you are an HR system SaaS provider, you are not learning anything useful from that Gen AI experiment. Something closer to how you create value for your customers would be a better choice.
Do not entertain sideshows just because they tick the right tech boxes. If you cannot describe the customer-relevant value streams, you lack sufficient strategic situational awareness. And if you don’t know when you are winning, why are you playing?
Key takeaways
Mature customers stand to gain a lot by productising FMs with Amazon Bedrock. However, realising that potential requires a technical focus on a modern data strategy. Alongside that, you need a clear connection between your enterprise strategy and real-world value streams to see which areas are worth pursuing. Without that focus and clarity, your Gen AI PoCs will remain in the toybox.
About the Author:
Chris Sherrington has 20 years of experience in Enterprise Architecture, Technology Strategy and Innovation Leadership, making new tech work with old tech.