Senior Manager – Big Data Engineering

New York · Full Time

Required Skills

Big Data, ETL, Hadoop, Java, JavaScript, Jenkins, Maven, NoSQL, Perl, Python, Shell, Unix, Agile, Apache, cloud, DB2, HBase, Hive, JUnit, leadership, Linux, management, MapReduce, Nose, Pig, scripting, Solr, Spark, Spring, Teradata, testing, UI, UX, data warehousing, xUnit, YARN
Do you want to be on the forefront of the next BIG thing in data? Do you love designing and implementing business critical data-driven products and data management solutions? Do you enjoy solving complex business problems in a fast-paced, collaborative, and iterative delivery environment? If this excites you, then keep reading.

Help us use technology to bring ingenuity, simplicity, and humanity to banking as a Sr. Manager Delivery Lead in our Enterprise Data Services department supporting our Big Data & Streaming Technology Greenhouse. It’s all in the name: Deliver. In this role, you will take a hands-on approach to building and leading multiple teams that define and DELIVER against a robust technology roadmap for implementing customized data-driven products within our Big Data Ecosystem. The right candidate is passionate about their craft and understands the full technology stack involved in building data-driven products (back-end data, APIs, and some UI/UX). We’re looking for someone who welcomes challenges and is hyper-focused on delivering exceptional results to internal business customers while creating a rewarding team environment.

Developing cutting-edge data technology solutions requires dynamic collaboration of bright, talented people, and forward-thinking leadership. Come see what makes us one of the “100 Best Companies to Work For”.

Job Expectations:
- Build and lead cross-functional Agile teams to create and enhance software that enables state-of-the-art, business-critical data management solutions
- Motivate teams to solve problems and deliver high quality results in a fast-paced, collaborative environment that’s geographically decentralized
- Define, establish, and execute against strategic delivery roadmaps, actively managing technology risks, issues and impediments
- Partner with customers and stakeholders to understand and clarify requirements to creatively solve complex business problems
- Design and develop robust, scalable, and iterative data management solutions leveraging Big Data/Java and open source technologies
- Transition traditional ETL/Data Warehousing solutions to solutions that leverage Big Data technologies hosted either internally or in the cloud
- Research and explore new technology patterns and frameworks when existing enterprise frameworks are not sufficient to meet customer needs
- Utilize various tools and frameworks within the Hadoop ecosystem (MapReduce, YARN, Pig, Hive, HBase, Sqoop), related Apache projects (e.g. Spark and Flink), and NoSQL databases
- Develop and implement data-enabling software utilizing open source frameworks and projects such as Spring, AngularJS, Solr, and Drools
- Leverage development practices such as continuous integration and test-driven development to enable the rapid delivery of working code
- Communicate clearly, both verbally and in writing, in dynamic discussions with business customers, vendors, and other engineering and product teams
- Respect the need for providing end-to-end governance, drive the team to adhere to existing processes, and define new processes and procedures when none exist


Your experience probably includes a few of these:
- Ability to handle multiple responsibilities in an unstructured environment where you’re empowered to make a difference. In that context, you will be expected to research and develop cutting edge technologies to accomplish your goals
- Proven track record of end-to-end implementation of top-notch data management solutions leveraging both Big Data (Hadoop ecosystem) and traditional data warehousing technologies (Ab Initio, Teradata, DB2)
- Experience delivering front-end and back-end solutions written in Java
- Proficiency with scripting languages (Python, Perl, JavaScript, Shell)
- Experience with automated build and continuous integration systems (Maven, Jenkins) and with test-driven development and unit-testing frameworks (JUnit, xUnit, Nose)
- UNIX/Linux skills, including basic commands, shell scripting, and system administration/configuration
- Experience with data mining, machine learning, or statistical modeling tools and their underlying algorithms

Your interests:
- You geek out over obscure sports statistics. You ponder what drove your streaming music service’s algorithms to accurately predict that you’d like the song you’re listening to. Nate Silver asks you who’s going to win the next election. You love data.
- You get a thrill out of using large data sets, some slightly messy, to answer real-world questions
- You yearn to be a part of cutting-edge, high profile projects and are motivated by delivering world-class solutions on an aggressive schedule
- You love building and leading teams to deliver world class solutions in a fun and rewarding environment
- You are passionate about learning new technologies and mentoring junior team members
- Humor and fun are a natural part of your flow

Basic Qualifications:
- Bachelor’s Degree or military experience
- At least 5 years of work experience delivering data-driven technology products and solutions
- At least 2 years of work experience delivering big data solutions
- At least 3 years of work experience delivering data-driven technology products and solutions within an Agile delivery environment
- At least 3 years of experience in People Management

Preferred Qualifications:
- Master’s Degree
- 3+ years of experience with the Hadoop stack
- 3+ years of experience with NoSQL databases
- 5+ years of experience delivering Java-based software solutions