We are looking for a DevOps Engineer who will be responsible for managing multi-cloud and on-premises environments and deployments for the DQLabs platform. A key skill is the ability to learn and understand new technologies, adopt best practices, and create processes to standardize deployments.
Location: Los Angeles, CA
Skills: AWS/Azure/GCP, React, Databricks, Spark, Python/Java/Scala, SQL, Hadoop, NodeJS, Django
• Develop and deliver creative solutions that automate systems engineering functions, reducing manual and unplanned effort.
• Analyze, design, implement, and validate strategies for streamlined CI/CD workflows.
• Design and implement automated dynamic environments to support the needs of delivery teams.
• Apply the principles of best practice, self-organization, autonomy, and continuous improvement to yourself and the team.
• Participate in 24x7 support of the data platforms.
• 3+ years of hands-on experience with enterprise-scale applications and systems on AWS/Azure/GCP and associated cloud services.
• Expertise in Big Data technologies, cloud data warehouses, and the Hadoop ecosystem, in particular Apache Spark and Databricks.
• Expertise in Python; experience with Java, Scala, or similar languages is a plus.
• Ability to work in a fast-paced, high-pressure, agile environment, with a willingness to learn new technologies and apply them at work to stay ahead of the curve.
• Expertise in building and managing large-volume data processing platforms (both streaming and batch), including performance optimization and clustering, is a must.
• Experience with version control software such as Git, along with CI/CD testing and automation experience.
• Excellent one-on-one communication and presentation skills, specifically the ability to convey technical information in a clear and unambiguous manner.
• Working knowledge of Linux and Windows operating systems.