We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team, working on a cutting-edge project that involves the development and deployment of large-scale data processing pipelines. In this position, you will play a critical role in designing, implementing, and maintaining the infrastructure that enables data processing, storage, and analysis.
You will work with a team of experienced professionals, tackling complex challenges and driving innovation in the field of data engineering.
If you are passionate about DevOps and have a solid understanding of data processing technologies, we invite you to apply for this exciting opportunity.

Responsibilities
- Design, implement, and maintain data processing pipelines
- Develop and maintain CI/CD pipelines for data processing applications, ensuring efficient and reliable deployment
- Implement and manage containerization technologies to enable scalable and flexible infrastructure
- Collaborate with data scientists and analysts to design and implement data storage and retrieval solutions
- Ensure the security and availability of data processing infrastructure, implementing best practices for data protection and disaster recovery
- Monitor and troubleshoot data processing pipelines and infrastructure, identifying and resolving issues in a timely manner
- Continuously improve data processing infrastructure, staying up to date with the latest technologies and industry trends

Requirements
- A minimum of 3 years of experience in DevOps, with a focus on data engineering and infrastructure management
- Expertise in CI/CD processes and tools, including Git, Jenkins, and TeamCity
- Hands-on experience with containerization technologies such as Docker and Kubernetes, as well as container orchestration tools like Amazon ECS
- In-depth knowledge of Amazon Web Services (AWS), including EC2, S3, and Lambda
- Strong proficiency in Linux system administration and shell scripting
- Experience with infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation
- Familiarity with the Elastic Stack (Elasticsearch, Logstash, and Kibana) for log management and analysis
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment
- Fluent spoken and written English at an Upper-Intermediate level or higher (B2+)

Nice to have
- Experience with other cloud providers such as Google Cloud Platform or Microsoft Azure
- Experience with Big Data technologies such as Hadoop, Hive, and Pig
- Familiarity with configuration management tools such as Chef or Puppet
- Knowledge of scripting languages such as Python or Ruby

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering, Information Technology, and Business Development
Industries: Software Development, IT Services and IT Consulting, and Retail
Machine Learning • Brasília, Distrito Federal, Brazil