Associate Data Engineer
About the Role
At NiCE, we don’t limit our challenges. We challenge our limits. Always. We’re ambitious. We’re game changers. And we play to win. We set the highest standards and execute beyond them. And if you’re like us, we can offer you the ultimate career opportunity that will light a fire within you.
So, what’s the role all about?
The Associate Data Engineer at NICE CXone sits between the teams that build data products and the infrastructure that runs them. We keep Snowflake environments healthy, deploy changes safely through maintenance windows, and make sure the pipelines and systems our partners depend on stay observable and reliable.
Your time will be split between operational work — coordinating and executing change requests, monitoring systems, supporting deployments — and project work focused on improving how we operate: better automation, cleaner alerting, more resilient processes.
How will you make an impact?
Release & deployment operations
- Coordinate and execute build and release activities across multiple production regions within scheduled maintenance windows
- Manage day-to-day operations of release pipelines, including Jenkins and GitHub Actions workflows, on behalf of data engineering teams
- Author and manage Change Requests in ServiceNow, following CAB and CHG guidelines
- Respond to platform alerts, investigate issues, and keep stakeholders informed
Observability & support
- Perform ongoing monitoring and maintenance of Snowflake environments — warehouse performance, access, and cost
- Monitor MSK (Kafka) consumers and producers via CloudWatch; escalate or resolve anomalies
- Keep runbooks current and contribute to post-incident reviews
- Support data engineering teams with deployment questions and environment issues
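For a flavor of the observability work: consumer-lag monitoring often reduces to a short script that reads CloudWatch datapoints and decides what needs escalation. A minimal sketch in Python — the datapoint shape, topic names, and the 10,000-offset threshold are illustrative assumptions, not NICE's actual alerting rules; in practice the datapoints would come from CloudWatch (namespace "AWS/Kafka", metric "SumOffsetLag") via boto3:

```python
# Illustrative sketch: decide which Kafka topics need escalation based on
# consumer-offset-lag datapoints. The CloudWatch fetch is stubbed with sample
# data so the decision logic itself is easy to read and test.

def topics_to_escalate(datapoints, threshold=10_000):
    """Return topic names whose worst observed offset lag exceeds threshold.

    datapoints: list of dicts like {"topic": str, "sum_offset_lag": int}.
    """
    worst = {}
    for point in datapoints:
        topic = point["topic"]
        worst[topic] = max(worst.get(topic, 0), point["sum_offset_lag"])
    return sorted(t for t, lag in worst.items() if lag > threshold)


# Sample datapoints standing in for a CloudWatch query result.
sample = [
    {"topic": "billing-events", "sum_offset_lag": 120},
    {"topic": "cdr-stream", "sum_offset_lag": 45_000},
    {"topic": "cdr-stream", "sum_offset_lag": 9_000},
]
print(topics_to_escalate(sample))  # -> ['cdr-stream']
```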
Improvement & automation
- Identify and prototype improvements to DataOps processes through scripting and automation
- Build and maintain tooling that reduces toil and improves the reliability of our release pipeline
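Toil-reduction tooling here is frequently a few dozen lines of Python. A hypothetical example: a Snowflake warehouse cost check — the row shape mimics what a query against SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY returns, and the warehouse names and budgets are invented for illustration:

```python
# Illustrative sketch: flag Snowflake warehouses whose total credit
# consumption exceeds a per-warehouse budget. Rows mimic the shape of
# ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY output; budgets are invented.

def over_budget(usage_rows, budgets):
    """Return {warehouse: total_credits} for warehouses above their budget."""
    totals = {}
    for row in usage_rows:
        totals[row["warehouse"]] = totals.get(row["warehouse"], 0.0) + row["credits_used"]
    return {
        wh: round(total, 2)
        for wh, total in totals.items()
        if total > budgets.get(wh, float("inf"))
    }


# Sample rows standing in for a metering-history query result.
rows = [
    {"warehouse": "ETL_WH", "credits_used": 40.5},
    {"warehouse": "ETL_WH", "credits_used": 25.0},
    {"warehouse": "BI_WH", "credits_used": 12.25},
]
print(over_budget(rows, {"ETL_WH": 50, "BI_WH": 50}))  # -> {'ETL_WH': 65.5}
```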
Day-to-Day Tech
- Snowflake
- AWS (S3, IAM, CloudWatch, Lambda)
- Jenkins · GitHub Actions
- MSK / Kafka
- Apache Airflow
- Python
- ServiceNow
- Jira
Have you got what it takes?
- 2+ years in a DataOps, Platform Operations, or DevOps/SRE role supporting data infrastructure
- Snowflake: hands-on with RBAC, warehouses, multi-environment administration, and SQL
- AWS: S3, IAM, CloudWatch, Lambda — comfortable in the console and with the CLI
- Experience defining and working within CI/CD pipelines using Jenkins and/or GitHub Actions
- Python scripting for operational automation
- Experience working within a formal change management process: CRs, maintenance windows, approvals
- Clear communicator when things break — you keep stakeholders informed without being asked
- Comfortable working async across time zones in a distributed team
- Familiar with AI tools and comfortable using them in day-to-day work
Nice to Have
- Apache Airflow — DAG monitoring and operational support
- MSK / Kafka operational experience