DevOps/Linux Engineer

New York, NY | Full Time
The most exciting part is the enormous potential for personal and professional growth. We are always seeking new and better tools to help us meet challenges, such as adopting proven open-source technologies to make our data infrastructure more nimble, scalable, and robust. Some of the cutting-edge technologies we have recently implemented are Kafka, Spark Streaming, Docker, and Mesos. As a DevOps Engineer, you'll support data infrastructure services such as Hadoop, RDBMS, Vertica, and Kafka, from installation and configuration to maintenance and curation. You will work closely with analysts, data scientists, and developers to make data processing transparent and to provide data services that help drive and support business goals.

Team Responsibilities:
- Installation, upkeep, maintenance, and monitoring of Kafka, Hadoop, Vertica, and RDBMS
- Ingest, validate, and process internal and third-party data
- Create, maintain, and monitor data flows in Hive, SQL, and Vertica for consistency, accuracy, and lag time
- Develop and maintain a framework for jobs (primarily aggregate jobs in Hive)
- Create consumers for data in Kafka, such as Camus for Hadoop, Flume for Vertica, and Spark Streaming for near-real-time aggregation
- Train developers and analysts on tools to pull data
- Tool evaluation, selection, and implementation
- Backups, retention, high availability, and capacity planning
- Disaster recovery: we run all of our core data services in a second data center for complete business continuity
- Review and approval of DDL for databases, Hive framework jobs, and Spark Streaming to make sure they follow our standards
- 24x7 on-call rotation for production support

Technologies We Use:
- Chronos: job scheduling
- Docker: packaged container images with all dependencies
- Graphite/Beacon: monitoring data flows
- Hive: SQL data warehouse layer for data in HDFS
- Impala: faster SQL layer on top of Hive
- Kafka: distributed commit log storage
- Marathon: cluster-wide init for Docker containers
- Mesos: distributed cluster resource manager
- Spark Streaming: near-real-time aggregation
- SQL Server: reliable OLTP RDBMS
- Sqoop: import/export of data to RDBMS
- Vertica: fast parallel data warehouse

Required Skills:
- 3+ years of experience supporting critical applications on Linux (preferably Red Hat), including OS installation and upgrades, package management, volume management, security auditing, and performance tuning, with 2+ years using configuration management and containerization tools
- Proficiency in at least one scripting language (e.g. Python, Perl, Ruby, Scala)
- Familiarity with RDBMS and SQL
- Proficiency with configuration management tools such as Ansible, Puppet, or Chef
- Experience with Docker and Mesos/Kubernetes is a huge plus
- Passion for engineering and computer science around data: you must be self-driven, inquisitive, and hungry to learn and improve
- Willingness to participate in a 24x7 on-call rotation
