Job Title: AI Implementation Engineer
Location: Bengaluru, Karnataka, India
Experience Level: 4–5 years
Department: Professional Services & Support
Employment Type: Full-time
About the Role:
We are seeking a skilled and motivated AI Implementation Engineer with hands-on experience in Amazon Bedrock and generative AI services. As part of our AI Solutions team, you will play a critical role in designing, deploying, and optimizing AI solutions that leverage Bedrock and other AWS services. You will collaborate with cross-functional teams, including implementation engineers, solution architects, and support engineers, to deliver high-impact AI integrations for clients and internal use cases.
Key Responsibilities:
- AI/ML Solution Implementation:
Design and implement generative AI applications using Amazon Bedrock and foundation models such as Anthropic Claude, Mistral, and Cohere.
Integrate Bedrock with other AWS services (e.g., Lambda, S3, API Gateway, Step Functions, SageMaker).
Develop and manage prompts, chaining logic, and response parsing for AI applications.
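For illustration, a minimal sketch (not a definitive implementation) of the kind of Bedrock integration and prompt/response handling described in this group, assuming boto3 is available, AWS credentials are configured, and the account has model access enabled for a Claude model; the region and model ID below are placeholders:
```python
# Minimal sketch: invoke an Amazon Bedrock foundation model and parse the reply.
# Assumes boto3 is installed, AWS credentials are configured, and the account
# has model access enabled for the referenced Claude model (placeholder ID).
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # example region


def ask_model(prompt: str) -> str:
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
        body=body,
    )
    payload = json.loads(response["body"].read())
    # The Anthropic Messages format returns a list of content blocks; keep the text parts.
    return "".join(block["text"] for block in payload["content"] if block["type"] == "text")


if __name__ == "__main__":
    print(ask_model("Summarize Amazon Bedrock in one sentence."))
```
In practice, prompt chaining and response-parsing logic would sit on top of a helper like this, for example invoked from Lambda behind API Gateway.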
- Client Delivery & Enablement:
Collaborate with product and customer teams to gather requirements and define AI-driven solution architectures.
Support PoCs, pilots, and production-grade deployments using AWS-native services.
Educate stakeholders on Bedrock’s capabilities and best practices for GenAI integration.
- Infrastructure & Deployment:
Build and deploy scalable and secure AI pipelines using IaC tools (e.g., CloudFormation, Terraform).
Ensure proper monitoring, logging, and alerting for deployed models and endpoints.
Optimize AI service costs and performance.
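As an illustration of the monitoring and alerting item in this group, a hedged sketch that raises a CloudWatch alarm on Bedrock server errors for a single model; the namespace, metric, and dimension names are assumptions based on Bedrock's published runtime metrics and should be verified in the target account, and the model ID and SNS topic ARN are placeholders:
```python
# Sketch: alarm on Bedrock invocation server errors for one model.
# Namespace, metric, and dimension names are assumed from Bedrock's runtime
# metrics documentation; verify them in the target account before relying on this.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # example region

cloudwatch.put_metric_alarm(
    AlarmName="bedrock-claude-server-errors",  # hypothetical alarm name
    Namespace="AWS/Bedrock",
    MetricName="InvocationServerErrors",
    Dimensions=[{"Name": "ModelId",
                 "Value": "anthropic.claude-3-sonnet-20240229-v1:0"}],  # placeholder model ID
    Statistic="Sum",
    Period=300,                 # 5-minute evaluation window
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:bedrock-alerts"],  # placeholder SNS topic
)
```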
- Security & Governance:
Implement fine-grained access controls using IAM roles and policies.
Ensure compliance with data privacy, security standards, and model governance frameworks.
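As a sketch of the fine-grained access control described in this group (the policy name and model ARN are placeholders, and the listed actions should be tightened to what the workload actually needs):
```python
# Sketch: a least-privilege IAM policy that allows invoking only one Bedrock
# foundation model. Policy name and model ARN are placeholders.
import json

import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "bedrock:InvokeModel",
            "bedrock:InvokeModelWithResponseStream",
        ],
        # Foundation-model ARNs are region-scoped and carry no account ID.
        "Resource": "arn:aws:bedrock:us-east-1::foundation-model/"
                    "anthropic.claude-3-sonnet-20240229-v1:0",
    }],
}

iam.create_policy(
    PolicyName="bedrock-invoke-claude-only",  # hypothetical policy name
    PolicyDocument=json.dumps(policy_document),
)
```
The resulting policy would typically be attached to the application's execution role (for example, a Lambda role) rather than to individual users.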
Qualifications:
Must-Have:
- 3–5 years of experience in AI/ML solution implementation or cloud engineering roles.
- Strong hands-on experience with Amazon Bedrock and deploying foundation models via APIs.
- Proficiency with Python, JSON, REST APIs, and prompt engineering techniques.
- Familiarity with AWS core services (e.g., Lambda, S3, API Gateway, IAM).
- Experience deploying applications using CI/CD pipelines and IaC tools.
- Strong problem-solving and communication skills.
Nice-to-Have:
- Experience with other GenAI platforms (e.g., OpenAI, Azure OpenAI, LangChain, RAG frameworks).
- Knowledge of vector databases and retrieval services (e.g., Pinecone, Amazon OpenSearch Service with vector search, Amazon Kendra).
- Background in NLP, transformers, or LLM fine-tuning (even at a basic level).
- AWS certifications (e.g., Solutions Architect Associate, Machine Learning Specialty).