Job Description

Reference # : 20-00590
Title : Technical Lead/Manager - Big Data - Spark/Hive/Impala/Kafka - Investment Bank
Location : Jersey City, NJ
Position Type : Contract
Experience Level :
Start Date / End Date : 06/01/2020 / 05/31/2021
Our client, a leading global financial services company, has approximately 200 million customer accounts and does business in more than 140 countries. They provide consumers, corporations, governments and institutions with financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management.


The Capital Markets Data team is building the next-generation data fabric to address business, analytics, and growing regulatory needs. Vast amounts of data assets have been accumulated over the years, and a data fabric built on emerging technologies will allow this data to be inspected, cleansed, and transformed to support decision-making. The candidate must possess a combination of skills and traits spanning delivery oversight, driving technology innovation to production, domain knowledge, building and managing high-performance teams, peer networking, management reporting, and aligning the organization to ISG goals and strategies.

Key Responsibilities:
- Work with the core team to deliver solutions using Impala/Hive, Parquet, Kafka, and related Big Data technologies
- Gather and understand requirements, analyze and convert functional requirements into concrete technical tasks, and provide reasonable effort estimates
- Responsible for end-to-end project delivery on schedule and with the required level of quality
- Reporting on all projects to senior management and cross-functional key stakeholders
- Coordinate the management of cross-function interdependencies and lead on the execution of communication plans to all key stakeholders
- Work proactively with global teams to address project requirements, and articulate issues/challenges with enough lead time to address project delivery risks
- Providing expertise in technical analysis and solving technical issues during project delivery
- Conduct code reviews and test-case reviews, and ensure the code developed meets requirements
- Responsible for systems analysis, architecture, design, coding, unit testing, and other SDLC activities
Must Have Experience:
- 10-15 years' relevant experience in technology development and project delivery
- Should have been involved in enterprise-scale, multi-region project development and tracking initiatives
- Experience in banking/capital markets, Risk, or Finance is necessary
- Experience leading large Big Data development programs on the Cloudera platform, with hands-on experience across the big data stack, including Spark (Scala and Java), Hive, and Impala
- Strong experience with relational and NoSQL databases
- Experience with development methodologies such as SDLC and Agile, including their key milestones and artifacts, size estimation, the structure of BRDs/FRDs, full project-plan preparation, and test methodologies (functional, regression, performance)
- Overview of programming paradigms (object-oriented, functional, etc.); should have driven a large build-out and implemented one or two programs end to end in a global model
- Complete project lifecycle exposure
- Exposure/ experience in enterprise level platform development
- Ability to manage high-performance teams in a high-pressure delivery environment
- Experience in systems analysis and programming of software applications
- Experience in managing and implementing successful projects
- Ability to work under pressure and manage deadlines or unexpected changes in expectations
- Graduate degree in Computer Science, Information Systems or equivalent quantitative field


Please see our complete list of jobs at:

Application Instructions

Please click on the link below to apply for this position. A new window will open and direct you to apply at our corporate careers page. We look forward to hearing from you!

Apply Online