AI & AWS Data Engineer
About the Role
Position: AI & AWS Data Engineer
Location: Pune / Nagpur / Chennai / Bangalore
Type of Employment: Full-time
Purpose of the Position: This role is ideal for individuals who are passionate about applying their expertise in AI Engineering, Data Engineering, programming, data analysis, and problem-solving to real-world challenges. As a key member of our engineering team, you will support clients on their AI journey by developing solutions that drive measurable business value. This position plays a critical role in advancing Infocepts’ strategic goals by delivering data-driven insights and enabling transformative AI/ML programs.
Key Result Areas and Activities:
- Data & Cloud Engineering: Design and maintain high-performance ETL/ELT pipelines on AWS using PySpark, Python, SQL across EMR, Glue, Lambda, Step Functions, Redshift, and S3.
- Applied AI – LLM, RAG, Agentic Workflows: Develop, fine-tune, and deploy LLMs (GPT‑4 class, BERT, T5) for enterprise applications. Implement RAG using FAISS, Elasticsearch, LangChain, LlamaIndex, or similar frameworks for grounding, vector search, and knowledge retrieval.
- AI Platform & Tooling (Modern Stack): Work with emerging enterprise AI platforms, including:
  - Snowflake Cortex for secure LLM inferencing, embeddings, and feature pipelines
  - Databricks Lakehouse AI / Databricks Genie for model training, inference, and agentic workloads
- Architecture, Governance & Collaboration: Contribute to solution architecture, AI governance, data security, and model lifecycle frameworks.
Work and Technical Experience
Required Qualifications
- 3+ years of experience across AI/ML and Data Engineering.
- Expert-level skills in Python, PySpark, and Scala, plus advanced SQL.
- Hands-on experience with AWS data and AI services.
- Working experience with LLMs, RAG, vector databases, and transformer-based architectures.
- Solid understanding of distributed systems, data modeling, and data platform design.
- Experience with modern AI frameworks such as LangChain, FAISS, or Elasticsearch.
- Experience with productivity-improvement tools such as GitHub Copilot.
Preferred / Nice-to-Have
- Good functional knowledge of the Media & Entertainment domain.
- Experience with Snowflake Cortex for secure enterprise LLM workloads.
- Experience with Databricks Genie / Lakehouse AI, Unity Catalog, Delta Live Tables.
- Knowledge of prompt engineering, safety guardrails, and agent tooling.
- Familiarity with feature stores, model registries, MLflow, or similar tooling.
- Knowledge of statistical methods, deep learning, and NLP techniques.
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field (a Master’s degree is a plus)
- Demonstrated continued learning through one or more technical certifications or comparable means
Qualities
- Strong collaboration across cross-functional and cross-time-zone teams.
- Persuasive communicator with strong presentation and storytelling skills.
Location: Pune, Maharashtra, India
Experience: 3 to 7 years