Key Responsibilities:
- Lead end-to-end solution architecture across data engineering, AI/ML workflows, and governance frameworks
- Define and implement reference architectures for large-scale distributed data platforms on AWS or Azure
- Design secure, scalable solutions using Databricks Lakehouse, Unity Catalog, and Delta Live Tables
- Integrate data pipelines across batch, streaming, and event-driven architectures (Kafka, Kinesis, Firehose, Fargate, etc.)
- Architect production-grade Gen AI or LLM-powered systems with agentic workflows where applicable
- Collaborate with stakeholders across security, DevOps, and analytics to enforce best practices
Required Experience:
- 8+ years in solution architecture, including at least 3 years on Databricks or comparable big data platforms
- Extensive experience designing cloud-native architectures on AWS or Azure
- Strong grounding in data lakehouse architecture, Delta Lake, SCIM, and access controls
- Proven background in architecting for regulated domains (aviation, finance, public sector, etc.)
- Hands-on experience with CI/CD for infrastructure and pipelines (Terraform, GitLab, Jenkins)
- Excellent communication and stakeholder management skills, with the ability to influence and drive architectural decisions
Preferred Qualifications:
- Databricks Architect or Professional certification
- Background in geospatial systems, predictive modeling, or MLOps
- Familiarity with security frameworks in cloud environments