Job Description

Reference # : 21-01048
Title : Big Data Developer
Location : Tampa, FL
Position Type : Contract
Experience Level :
Start Date / End Date : 08/09/2021 / 12/31/2022
Description
Our client, a leading global financial services company, has approximately 200 million customer accounts and does business in more than 140 countries. They provide consumers, corporations, governments and institutions with financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management.

Job Purpose: This position requires the candidate to be hands-on in designing, building, and implementing robust and scalable applications that fulfill business requirements. The role also requires the candidate to create and develop technical design solutions that meet both technical and business requirements.

Job Background/context:
-5 to 8 years of application development experience across the full lifecycle
-Prior ETL development experience building data warehouses and data pipelines.
-The current team structure is relatively flat, which enables the organization to rapidly deploy the most suitable team to meet specific client needs. Successful candidates will therefore need to be flexible, able to multi-task, and able to excel in a constantly changing environment.
-Strong knowledge of and hands-on experience with Hadoop, Hive, Spark, Impala, Sqoop, and other technologies in Cloudera's CDH distribution.
-Prior experience building solutions and reusable components on Big Data platforms.
-Prior experience designing and developing data ingestion processes and Data Quality and Reconciliation rules.
-Strong knowledge of performance tuning in Hadoop ecosystems.
-Experience defining architecture and technical designs for use cases in Hadoop ecosystems.
-Experience with Red Hat Linux and UNIX Bash shell scripting.
-Strong knowledge of and experience with Python and PySpark is preferred.
-Strong knowledge of and experience with software release lifecycle management tools such as Bitbucket and Jenkins.

Key Responsibilities:
-Design, develop, and implement robust and scalable data pipelines on the Big Data technology stack.
-Perform defect analysis and support offshore development teams in resolving defects/production issues.
-Perform platform upgrades across SDLC environments from an application standpoint.
-Coordinate and work with Platform engineering team, support teams and architects to triage technical issues and identify resolutions.
-Triage production issues and ensure business-as-usual (BAU) operations.
-Proactively notify stakeholders of risks, bottlenecks, problems, issues, and concerns.
-Comply with the client's System Development Lifecycle and Information Security requirements.
-Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
-Design, develop, and enhance the existing framework.
-Brainstorm the best technical approaches to changes in the system
-Contribute ideas to the evolution of the system architecture
-Contribute ideas to the refinement of the team development tools and processes
-Manage time and changing priorities in a dynamic environment
-Provide quick turnaround on software issues and management requests
-Assimilate key issues and concepts and come up to speed quickly
-Develop prototypes and proof of concepts for the selected solutions

Development Value:
-Opportunity to work on strategic AML Monitoring transformation initiatives.
-Exposure to and the opportunity to work with innovative technologies in the Big Data and Hadoop ecosystems.
-Contribute to projects involving complex feature-based data algorithms and machine learning.
-Exposure to AML monitoring processes and functional knowledge.
-A team with a win-together attitude, a strong sense of identity, and a positive culture.

Skills:
Mandatory Skills:
-Strong experience in Hadoop, Hive, SQL, Spark with solid understanding of ETL/Data Pipelines
-Strong knowledge of/experience with Python and PySpark preferred
-Good Experience in Unix Shell Scripting
-Autosys job scheduler
-Architecture Design
Desirable Skills:
-Machine Learning
-AML domain knowledge

Other skills & Qualifications:
-Bachelor's degree (in science, computer science, information technology, or engineering)
-Team player, collaborating with team members to solve complex problems and deliver best-in-class solutions as a team
-Ability to articulate thoughts and ideas effectively within the team.
-Strong influencing & interpersonal skills.

Competencies
1. Proven ability to work with teams that are geographically separated
2. Should be a self-starter with a high level of initiative.
3. Excellent oral and written communication skills, as the candidate is expected to work with onshore and near-shore teams


Please see our complete list of jobs at:
www.rmscorp.com

Application Instructions

Please click on the link below to apply for this position. A new window will open and direct you to apply at our corporate careers page. We look forward to hearing from you!

Apply Online