
Sr. Application Engineer Resume


SUMMARY:

  • Possesses 8.8 years of experience working with different clients and projects of Confidential.
  • Gained strong knowledge of the design and implementation of Data Warehousing, ETL and Integration projects.
  • 2.5 years of working experience in the design, development and implementation of business solutions using the Hadoop/Big Data ecosystem: HDFS, MapReduce, Hive, Impala, Sqoop and Spark.
  • Currently working on AWS platform using services like S3, EMR, EC2, Athena and Presto.
  • Experience in writing reusable components and UDFs using Python scripting.
  • Strong understanding of the NoSQL database DynamoDB.
  • Experience in using GitHub, BitBucket and Jenkins in code development and deployment cycle.
  • Hands-on experience managing Hadoop clusters and services using Cloudera Manager.
  • Extensive working experience with Informatica PowerCenter tools to provide a repository that supports key operational work as well as business rules.
  • Worked on real-time data sync-up and replication using Informatica Data Replicator.
  • Experience includes working for Retail and Banking clients.
  • Currently working at US client location as a Big Data Engineer.
  • Good experience working on different warehousing components of Oracle DB.
  • Good working experience creating and upgrading UNIX scripts.
  • Worked on different batch control and monitoring tools like Control-M, AutoSys and Airflow.
  • Currently working under the Agile development model using Jira as the monitoring tool.
  • Experience in designing, reviewing and running QA test cases using HP Quality Centre.
  • Experienced in all phases of SDLC (analysis, design, development, testing and deployment).
  • Actively involved in requirement gathering with business users and business analysts.

TECHNICAL SKILLS:

Hadoop Ecosystem: Spark, Hive, HDFS, Sqoop

Programming Language: Python

ETL: Informatica PowerCenter, IDR

Databases: Oracle 11g, SQL Server, DynamoDB

Testing Tools: HP Quality Centre

Scheduling Tools: Control-M, Autosys, Airflow

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Application Engineer

Environment: Hive, Spark, S3, EMR, Airflow, Python, Oracle 11g, UNIX Server, Agile Jira

Responsibilities:

  • Create data pipelines and flows using PySpark and HQL, taking events from Nike’s applications as raw data and filtering and loading consumer response data into AWS S3 locations used by Hive external tables.
  • Work with different Hadoop file formats like JSON, Avro and Parquet, determining which to use at each stage.
  • Enhance and use a PySpark application for parsing and filtering JSON source data.
  • Developed Python-based scripts for adding dynamic partitions to Hive stage tables, verifying JSON schema changes in source files, and detecting duplicate files in the source location.
  • Create DAGs for Spark jobs, determining batch dependencies and job triggers, using Airflow as the batch job automation tool.
  • Monitor job logs on the YARN Resource Manager; perform failure root-cause analysis and resolution.
  • Participate in daily scrum meetings for discussion on project progress.
  • Develop process workflow documents, artefact docs and a flow handbook for production support.
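The duplicate-file check mentioned above can be sketched roughly as follows. This is a minimal illustration, not the actual project code; the directory layout, `*.json` glob and function names are assumptions for the example:

```python
import hashlib
from pathlib import Path


def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def find_duplicates(source_dir: str) -> dict:
    """Group JSON files in source_dir by content digest; any group
    with more than one member is a set of duplicate files."""
    groups = {}
    for path in sorted(Path(source_dir).glob("*.json")):
        groups.setdefault(file_digest(path), []).append(path.name)
    return {d: names for d, names in groups.items() if len(names) > 1}
```

A script like this would run ahead of the Spark load so that byte-identical source files are flagged before they create duplicate rows in the Hive stage tables.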
Confidential

Sr. Application Engineer

Environment: Informatica 9.1, IDR, Teradata, Python, Oracle 11g, UNIX Server, Autosys, Agile Jira

Responsibilities:

  • Gather mapping requirements through stakeholder meetings, document analysis, business process descriptions, use cases and scenarios.
  • Create ETL mappings using Informatica PowerCenter from varied heterogeneous data sources to a central data warehouse.
  • Create reusable Python frameworks using the pandas library for data reconciliation between source and target systems, data file auditing, and a consumption log to keep track of daily input files.
  • Use Teradata loader connections like FastLoad and MultiLoad for performance optimization.
  • Work on design, development, performance tuning and maintenance of different database objects like tables, views, materialized views, functions, triggers and sequence generators as per project requirements.
  • Design and schedule batches using the AutoSys job automation tool.
  • Create and execute unit and system test cases, debug code, and set up test data as per business needs; provide support for execution of user acceptance testing.
  • Keep the client up to date on team progress in daily scrum meetings across the different phases of development and testing.
  • Coordinate with different business teams for seamless flow of data.
  • Design reviews, quality assurance and peer reviews.
  • Deployment and coordination with support teams.
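A source-to-target reconciliation check of the kind described above can be sketched with pandas. This is a minimal illustration, not the project's actual framework; the key and amount column names and the exact-match comparison are assumptions:

```python
import pandas as pd


def reconcile(source: pd.DataFrame, target: pd.DataFrame,
              key: str, amount_col: str) -> pd.DataFrame:
    """Compare source vs. target per key value: row counts and summed
    amounts. Returns only the keys where the two systems disagree."""
    src = source.groupby(key).agg(rows=(amount_col, "size"),
                                  total=(amount_col, "sum"))
    tgt = target.groupby(key).agg(rows=(amount_col, "size"),
                                  total=(amount_col, "sum"))
    # Outer join keeps keys present in only one system; missing side -> 0.
    report = src.join(tgt, lsuffix="_src", rsuffix="_tgt",
                      how="outer").fillna(0)
    mismatch = ((report["rows_src"] != report["rows_tgt"]) |
                (report["total_src"] != report["total_tgt"]))
    return report[mismatch]
```

An empty result means the two systems agree on both row counts and totals for every key; any returned rows can be written to the audit/consumption log for follow-up.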
Confidential

Offshore Team Lead

Environment: Informatica 9.1, Oracle 11g, UNIX Server, Control-M

Responsibilities:

  • Breaking high-level business and user requirements down into high-level and detail-level design documents.
  • Assigning and monitoring progress of ETL code development with offshore team members.
  • Defining quality attributes constraints and other non-functional requirements.
  • Review test cases, perform system testing and assist business users in UAT. Work closely with QA teams on establishing test plans and execute test cases.
  • Work closely in resolving production issues and providing solutions on ad-hoc data issues.
  • Working with the Informatica admin team on health checks and performance of the project's various workflows.
  • Responsible for production migration plan.
Confidential

Senior Developer

Environment: Informatica 8.6, Oracle 10g, UNIX Server, Control-M

Responsibilities:

  • Design of the Whole of Project Artefacts (WOPA) document, which records the data quality issues faced by business bankers in front-end reporting.
  • Providing fixes, including code changes to existing SQL or ETL code, development of new queries, and Control-M job schedules.
  • Lead the complete migration activity.
  • Train end users and application owners on new features and changes.
  • Working with the production support team to resolve operational issues.
