
Senior Consultant Resume


SUMMARY:

  • Big Data Senior Consultant with 11+ years of overall IT experience in Hadoop, Spark, AWS, and data warehousing technologies. Strong knowledge of the banking & finance domain, particularly AML and capital markets.

TECHNICAL SKILLS:

Big Data: Hadoop (HDFS, MapReduce), Cloudera, Hortonworks 2.4/2.5, Spark, Kafka, Storm, Flume, Pig, Hive, Elasticsearch, HBase, Sqoop, ZooKeeper, Phoenix, Pig/Hive custom UDFs.

Languages: Scala, SQL, Python, Pig Latin.

Scripting Languages: UNIX shell script.

Databases: DB2, Oracle, MySQL.

Ingestion/UI Tools: Kibana, Attivio, Apache NiFi, Platfora, Attunity, Hue, Ambari.

AWS: EMR, EC2, S3, IAM, RDS, DynamoDB, Aurora, Kinesis, SQS

SCM Tools: Ambari, GitHub, JIRA, SQL Developer, Control-M.

IDE: Eclipse, IPython Notebook

ETL Tools: Ab Initio GDE (1.15/3.1), Ab Initio Co-op (2.15/3.1)

Mainframe Skills: COBOL, JCL, VSAM

Domain: Banking & Finance, AML, Risk & Customer Analytics

PROFESSIONAL EXPERIENCE:

Senior Consultant

Confidential

Responsibilities:

  • Provide technical subject matter expertise in Big Data Architecture and Data Management technologies
  • Architect scalable big data architecture, data processing and analytics solutions
  • Work closely with the customer and the solutions architect to translate the requirements into a Big Data solution.
  • Design/develop MapReduce programs using HiveQL, Pig, and shell scripts.
  • Design/develop POC solutions using Elasticsearch and Kibana.
  • Design/develop POC solutions using Python.
  • Design/develop cloud solutions on Amazon Web Services (AWS).
  • Design/develop advanced Hive scripts using various analytic functions.
  • Hands on experience in Apache Spark and Scala programming.
  • Provide solutions for optimization of the existing process
  • Support the execution of Attivio components: Hive connectors, case object builder, and validations.
  • Coordinate with business and upstream/downstream teams during the SIT and UAT phases.
  • Unit testing, documentation, and team support.
  • Experience working on agile projects.
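The "advanced Hive scripts using various analytic functions" above generally revolve around windowing. As a language-neutral illustration, this is a minimal Python sketch of the same logic as Hive's ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...); the column names and sample rows are hypothetical, not project data:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical transaction rows: (account_id, txn_date, amount)
rows = [
    ("A1", "2016-01-03", 120.0),
    ("A1", "2016-01-01", 75.0),
    ("A2", "2016-01-02", 300.0),
]

# Equivalent of ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY txn_date):
# sort by the partition key plus ordering key, then number rows within each group.
rows.sort(key=itemgetter(0, 1))
ranked = []
for _, group in groupby(rows, key=itemgetter(0)):
    for n, (acct, date, amt) in enumerate(group, start=1):
        ranked.append((acct, date, amt, n))
```

In HiveQL the same result comes from a single SELECT with the window clause; the sort-then-group pattern here mirrors how the engine partitions the data.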

Senior Consultant

Confidential

Responsibilities:

  • Collaborate with data scientists to create the system requirements for this POC
  • Prepare the technical design document.
  • Create the training and test datasets using Python data analysis libraries.
  • Develop and execute the ML models.
  • Compute the ML metrics and evaluate the performance of the machine learning models.
  • Provide the required documentation to the offshore resources.
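The metric computation mentioned above can be sketched in a few lines of plain Python; the labels and predictions below are illustrative only, not project data:

```python
# Minimal sketch of evaluating a binary classifier with precision, recall, and F1.
def precision_recall_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Illustrative labels vs. model predictions.
p, r, f = precision_recall_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In practice a library such as scikit-learn provides these metrics directly; the point here is only what each number measures.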

Case Manager Senior Consultant

Confidential

Responsibilities:

  • Interface with AML SMEs and create the system requirements for the case manager tool
  • Prepare the technical design document for the case manager tool.
  • Develop and unit test the data transformation logic in Spark-Scala.
  • Develop the Spark logic to write the data frames into Elasticsearch.
  • Create the Elasticsearch indices and vizboards.
  • Load and analyze streaming data using Amazon Kinesis.
  • Develop the dashboards in Kibana UI.
  • Create the EMR cluster and install the required tools.
  • Set up the required users, groups, roles, and policies using AWS IAM.
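The Elasticsearch writes above were done from Spark/Scala (typically via the elasticsearch-hadoop connector). As a language-neutral illustration, here is a minimal Python sketch of the NDJSON payload shape the Elasticsearch _bulk API expects; the index name, fields, and records are hypothetical:

```python
import json

# Hypothetical case records destined for a "cases" index.
records = [
    {"case_id": "C-001", "status": "open"},
    {"case_id": "C-002", "status": "closed"},
]

# The _bulk API takes alternating action and source lines, newline-delimited,
# with a trailing newline after the last line.
def to_bulk_payload(index_name, docs, id_field):
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index_name, "_id": doc[id_field]}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

payload = to_bulk_payload("cases", records, "case_id")
```

A connector hides this framing, but it is useful to know when debugging rejected documents in the bulk response.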

Senior Consultant

Confidential

Responsibilities:

  • Provide the implementation support in Microsoft Azure.
  • Analyze the system parameters and fine-tune them.
  • Create the design options in Spark to enhance the tool.
  • Migrate the solution to AWS using a custom AMI.
  • Enhance the existing solutions using Spark.

Senior Consultant

Confidential

Responsibilities:

  • Interface with the business team to understand the requirements and conceptualize and explain analytical solutions.
  • Coordinate with the offshore team during development.
  • Develop advanced Hive scripts using various analytic functions.
  • Optimize the Hive and Pig scripts.
  • Develop Python scripts for text parsing/mining.
  • Develop visualization charts/graphs using Platfora.
  • Coordinate with business and upstream teams during the SIT and UAT phases.
  • Unit testing, documentation, and team support.
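The text parsing/mining scripts above typically start with tokenization and term counting; a minimal sketch, with the sample text and token rule as assumptions rather than project code:

```python
import re
from collections import Counter

# Lowercase the text, extract word tokens, and count term frequencies.
def top_terms(text, n=3):
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens).most_common(n)

# Illustrative input, not project data.
sample = "Wire transfer flagged; wire transfer reviewed; transfer cleared."
top = top_terms(sample)
```

Real mining pipelines add stop-word removal, stemming, and n-grams on top of this, but the tokenize-and-count core is the same.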

Senior Hadoop Developer

Confidential

Responsibilities:

  • Creating ETL Process to move data from Source systems to Hadoop.
  • Create MapReduce code to convert source files in EBCDIC format to ASCII.
  • Create Data quality framework to do the basic validation of the data from source.
  • Create the key-and-split framework for adding the key columns and splitting the NPI/non-NPI data
  • Experience in transforming and analyzing data using Hive QL and Pig Latin.
  • Experience in developing custom UDFs/UDAFs, handling updates in Hive, and Apache Sentry
  • Experience in optimizing Hive queries and performance tuning
  • Experience in various processes such as historical (one-time) loads from Teradata using an unload script built on Teradata Parallel Transporter (TPT) and FastLoad scripts.
  • Registration of the datasets in a metadata registry that controls admittance into Hadoop
  • Good understanding of Hadoop Data classification and Directory Structure options.
  • In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, MRv1 and MRv2 (YARN).
  • Provided solutions for different Integration and user acceptance testing issues. Coordinating with offshore team and provide them analysis and guidance.
  • Ensured the timely completion of the Unit and Integration Testing, testing effort on the project by coordinating with Business SMEs / IT, interface teams and stake holders.
  • Involved in daily SCRUM meetings to discuss the development/progress of Sprints and was active in making scrum meetings more productive.
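The EBCDIC-to-ASCII conversion above ran as a MapReduce job; the character-level translation at its core can be shown with Python's built-in "cp037" codec (the common US EBCDIC code page). This sketch covers plain character fields only, not packed-decimal (COMP-3) values, and the sample bytes are illustrative:

```python
# Decode EBCDIC bytes into text using the stdlib cp037 codec.
def ebcdic_to_ascii(raw: bytes) -> str:
    return raw.decode("cp037")

# "HELLO" encoded in EBCDIC cp037.
sample = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])
text = ebcdic_to_ascii(sample)
```

In a real mainframe feed the record layout (from the COBOL copybook) determines which byte ranges are character data and which need numeric unpacking first.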

Technical Lead

Confidential

Responsibilities:

  • Development of extraction programs that use ALE or file interfaces; analyze the requirements, convert them to a proper ETL design, and then develop them using Ab Initio graphs.
  • Development of UNIX wrapper korn shell scripts.
  • Promotion of project from Development to SIT and UAT to Production.
  • Monitor the jobs using the Control-M GUI and work closely with the production support team to ensure successful completion of jobs.
  • Maintaining the versions of project in EME.
  • Working in coordination with other teams like Java, Load testing, QA in different environments and System Integration Testing (SIT) and User Acceptance Testing (UAT).
  • Optimize graphs from time to time, resolve graph performance issues, and explore the tools to utilize them to the optimum level.
  • Maintain proper project documentation, such as design documents, and explain the documents to other teams.

Collections Mainframe Developer

Confidential

Responsibilities:

  • Development and Unit testing of Batch Programs
  • Provide System testing/Integration testing support
  • Fix the defects in System testing/Integration testing
  • Resolve production issues and make sure to meet the SLA of the Batch process.

Mainframe Developer

Confidential

Responsibilities:

  • Requirement analysis, Design, Development and Unit testing.
  • Development and Unit testing of Batch Programs
  • Provide System testing/Integration testing support
  • Fix the defects in System testing/Integration testing
  • Resolve production issues and make sure to meet the SLA of the Batch process
