9 May 2025
For more than three decades, Strategic Data Systems (SDS) has been a software consultancy specializing in strategy, technology, and business transformation for Fortune 100 companies, mid-sized firms, and startups. At SDS, we empower our development teams to address our clients' critical business challenges by leveraging cutting-edge technologies. If you seek a workplace where your contributions are truly appreciated, SDS is the company for you. Join us to work alongside fellow development specialists and become a crucial part of our dynamic, cohesive community.
What You’ll Do
TECHNICAL SKILLS
Must Have
- AWS services: Bedrock, SageMaker, ECS, and Lambda
- Demonstrated contributions to open-source AI/ML/Cloud projects
- Demonstrated proficiency in the Python and Go (Golang) programming languages
- Experience implementing RAG architectures and using frameworks and ML tooling such as Transformers, PyTorch, TensorFlow, and LangChain
- Experience building and fine-tuning large language model (LLM) applications
- Ph.D. in AI/ML/Data Science
Nice To Have
- Demonstrated experience with AWS Organizations and policy guardrails (SCPs, AWS Config)
- FinOps experience (optimizing the cost-performance of AI workloads)
We are hiring a Distinguished Cloud AI Software Engineer who has actually built AI/ML applications, not just read about them. You will operate as a hands-on trusted advisor, developing retrieval-augmented generation (RAG) systems, fine-tuning LLMs, and building AWS-native microservices that drive automation, insight, and governance in an enterprise environment. You'll design and deliver scalable, secure services that bring large language models into real operational use, connecting them to live infrastructure data, internal documentation, and system telemetry.
You’ll be part of a high-impact team pushing the boundaries of cloud-native AI in a real-world enterprise setting. This is not a prompt-engineering sandbox or a resume keyword trap. If you’ve merely dabbled in SageMaker, mentioned RAG on LinkedIn, or read about vector search—this isn’t the right fit. We’re looking for candidates who have architected, developed, and supported AI/ML services in production environments.
This is a builder’s role within our Public Cloud AWS Engineering team. We aren’t hiring buzzword lists or conference attendees. If you’ve built something you’re proud of—especially if it involved real infrastructure, real data, and real users—we’d love to talk. If you’re still learning, that’s great too—but this isn’t an entry-level role or a theory-only position.
DUTIES AND RESPONSIBILITIES:
- Design, develop, and maintain modular AI services on AWS using Lambda, SageMaker, Bedrock, S3, and related components—built for scale, governance, and cost-efficiency.
- Lead the end-to-end development of RAG pipelines that connect internal datasets (e.g., logs, S3 docs, structured records) to inference endpoints using vector embeddings (a minimal illustrative sketch follows this list).
- Design and fine-tune LLM-based applications, including Retrieval-Augmented Generation (RAG) using LangChain and other frameworks.
- Tune retrieval performance using semantic search techniques, careful metadata handling, and prompt-construction patterns for injecting retrieved context.
- Collaborate with internal stakeholders to understand business goals and translate them into secure, scalable AI systems.
- Own the software release lifecycle, including CI/CD pipelines, GitHub-based SDLC, and infrastructure as code (Terraform).
- Support the development and evolution of reusable platform components for AI/ML operations.
- Create and maintain technical documentation for the team to reference and share with our internal customers.
- Communicate clearly with stakeholders, both verbally and in writing, in English.
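To give a concrete sense of the RAG work described above, here is a minimal, illustrative Python sketch of a retrieval-augmented lookup against Amazon Bedrock. It is not SDS's implementation: the model IDs, toy document set, prompt format, and region are assumptions, and a production pipeline would use a real vector store, cached embeddings, and proper error handling.

```python
"""Minimal RAG sketch against Amazon Bedrock (illustrative only)."""
import json

import boto3
import numpy as np

# Region and model IDs below are assumptions, not the team's actual configuration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Toy corpus standing in for internal docs, logs, or S3 content.
documents = [
    "Runbook: restart the ingestion service when the SQS backlog exceeds 10k messages.",
    "Policy: all S3 buckets holding telemetry must enable default encryption.",
    "Guide: SageMaker endpoints are deployed via the shared Terraform module.",
]


def embed(text: str) -> np.ndarray:
    """Embed text with Titan Embeddings via Bedrock."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return np.array(json.loads(resp["body"].read())["embedding"])


def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query)
    doc_vecs = [embed(d) for d in documents]
    sims = [float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v))) for v in doc_vecs]
    ranked = sorted(zip(sims, documents), reverse=True)
    return [doc for _, doc in ranked[:k]]


def answer(query: str) -> str:
    """Inject retrieved context into the prompt and call an LLM on Bedrock."""
    context = "\n".join(retrieve(query))
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 300,
            "messages": [{
                "role": "user",
                "content": f"Use this context:\n{context}\n\nQuestion: {query}",
            }],
        }),
    )
    return json.loads(resp["body"].read())["content"][0]["text"]


if __name__ == "__main__":
    print(answer("How do I recover from an ingestion backlog?"))
```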
SUPERVISORY RESPONSIBILITIES: None
REQUIRED KNOWLEDGE, SKILLS, AND ABILITIES:
- 10+ years of proven software engineering experience with a strong focus on Python, plus Go (Golang) and/or Node.js.
- Demonstrated contributions to open-source AI/ML/Cloud projects, with either merged pull requests or public repos showing real usage (forks, stars, or clones).
- Direct, hands-on development of RAG, semantic search, or LLM-augmented applications, using frameworks and ML tooling like Transformers, PyTorch, TensorFlow, and LangChain—not just experimentation in a notebook.
- Ph.D. in AI/ML/Data Science and/or named inventor on pending or granted patents in machine learning or artificial intelligence.
- Deep expertise with AWS services, especially Bedrock, SageMaker, ECS, and Lambda.
- Proven experience fine-tuning large language models, building datasets, and deploying ML models to production.
- Demonstrated success delivering production-ready software with release pipeline integration.
NICE-TO-HAVES:
- Policy as Code development (e.g., Terraform with Sentinel) to manage and automate cloud policies and ensure compliance
- Experience optimizing cost-performance in AI systems (FinOps mindset).
- Awareness of data privacy and compliance best practices (e.g., PII handling, secure model deployment).
- Demonstrated experience with AWS Organizations and policy guardrails (SCPs, AWS Config); a minimal sketch follows this list.
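As a rough illustration of the Organizations/SCP guardrail item above, here is a short Python sketch using boto3's Organizations API to create and attach a Service Control Policy that limits Bedrock usage to an approved region. It is an assumption-laden example, not an SDS policy: the policy content, names, OU ID, and region are placeholders, and the calls must run from the organization's management account.

```python
"""Illustrative SCP guardrail sketch (placeholder policy and OU ID)."""
import json

import boto3

org = boto3.client("organizations")

# Example SCP: deny Bedrock API calls outside us-east-1 (region is a placeholder).
scp_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyBedrockOutsideApprovedRegion",
        "Effect": "Deny",
        "Action": "bedrock:*",
        "Resource": "*",
        "Condition": {"StringNotEquals": {"aws:RequestedRegion": "us-east-1"}},
    }],
}

# Create the policy in the organization.
policy = org.create_policy(
    Content=json.dumps(scp_document),
    Description="Restrict Bedrock usage to approved regions",
    Name="bedrock-region-guardrail",
    Type="SERVICE_CONTROL_POLICY",
)

# Attach the guardrail to an organizational unit (placeholder OU ID).
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-examp-12345678",
)
```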
SDS, Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws.
What You'll Get
- Competitive base salary
- Medical, dental, and vision insurance coverage
- Optional life and disability insurance
- 401(k) with a company match and optional profit sharing
- Paid vacation time
- Paid bench time
- Training allowance
- You’ll be eligible to earn referral bonuses!