Generative AI in Production
Course 1485
1 DAY COURSE
Course Outline
Traditional MLOps is a set of practices for productionizing traditional ML systems in enterprise applications. Generative AI raises new challenges in managing and productionizing applications at scale, and the field of generative AI operations (GenAIOps) seeks to address them. In this course, you learn about the challenges that arise when deploying and productionizing generative AI-powered applications, how to secure those applications, and best practices for logging and monitoring them in production.
Generative AI in Production Benefits
This course will empower you to:
- Understand the challenges in productionizing applications using generative AI
- Manage experimentation and evaluation for LLM-powered applications
- Productionize LLM-powered applications
- Secure generative AI applications
- Implement logging and monitoring for LLM-powered applications
Prerequisites
Completion of the "Application Development with LLMs on Google Cloud" course, or equivalent knowledge
Who Should Attend
Developers and DevOps Engineers who wish to operationalize GenAI-based applications
Generative AI in Production Course Outline
Learning Objectives
Module 1: Introduction to Generative AI in Production
- Understand generative AI operations
- Compare traditional MLOps and GenAIOps
- Analyze the components of an LLM system
- Define and compare RAG and ReAct
Module 2: Generative AI Application Deployment
- Evaluate application deployment options
- Deploy, package, and version apps
Module 3: Productionizing Generative AI
- Maintain and update LLMs
- Test and evaluate gen AI-powered apps
- Deploy CI/CD pipelines for gen AI-powered apps
Module 4: Logging and Monitoring for Production LLM Systems
- Utilize Cloud Logging
- Version, evaluate, and generalize prompts
- Monitor for evaluation-serving skew
- Utilize continuous validation
Module 5: Securing Generative AI Applications
- Identify security challenges for gen AI applications
- Understand prompt security issues
- Apply Sensitive Data Protection and the DLP API
- Implement Model Armor
Module 6: Observability for Production LLM Systems
- Describe the purpose and capabilities of Google Cloud Observability
- Explain the purpose of Cloud Monitoring
- Explain the purpose of Cloud Logging
- Explain the purpose of Cloud Trace
Private Team Training
Interested in this course for your team? Please complete and submit the form below and we will contact you to discuss your needs and budget.