
Informatica Lead Developer, Scrum Master, Big Data Engineer Resume


SUMMARY:

  • Over 12 years of IT experience in architecting, designing, and building Data Warehouse/Data Mart and Business Intelligence applications.
  • Finance (Liquidity Risk Management) and Insurance (WC Premium and Loss) domain experience.
  • Experienced as a Technical BA for Treasury projects at Confidential.
  • Proficient in end-to-end project execution.
  • Experience with Informatica PowerCenter.
  • Working in Big Data: Hadoop, Hive, Impala, Sqoop, and Spark.
  • Over 4 years of project management experience.
  • Experienced in Agile and Waterfall methodologies in project execution.
  • Experience in Oracle, Teradata, DB2, and SQL Server relational databases.
  • Proficient in UNIX and Linux shell scripting.
  • Experience in data profiling and data quality.
  • Experienced in the complete SDLC process per CMMI Level 5 standards.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 8.x/9.x/10.x

Data Management: Informatica Metadata Manager

Big Data Solutions: Hadoop, Hive, Impala, Sqoop, Spark

Data Quality: Informatica Data Profiling

Scheduling: AutoSys, Control-M

Agile Tools: Odyssey

Project Management: Microsoft Project, Estimators

Business Modeling: Microsoft Visio

OLAP Tools: OBIEE

Databases: Oracle 10g/11g, Teradata, DB2, SQL Server 2008

Markup: XML

Data Modeling: Star Schema, Snowflake Schema, Dimensional Modeling

Languages: SQL, PL/SQL, Shell Scripting

Testing Tools: HP Quality Center

Operating Systems: UNIX, Linux, Windows XP

Problem/Change Management: ServiceNow

EXPERIENCE DETAILS:

Confidential

Informatica Lead developer, Scrum Master, Big Data Engineer

Responsibilities:

  • Designed ETL processes that handle millions of records from ~36 sources on a daily basis.
  • Executed projects in both Agile and Waterfall methodologies.
  • Overall project accountability.
  • Managed key client relationships and interactions.
  • Handled huge volumes of structured data using Hadoop, Hive, Impala, Sqoop, Spark, and Scala.
  • End-to-end project management.
  • Capacity planning of resources and work allocation.
  • Requirements gathering and ensuring client sign-offs.
  • Built test strategies and plans.
  • Sought QA approvals for all documentation per CMMI Level 5 standards.
  • Managed environments for releases and change management.
  • Designed and architected the data warehouse for Fed 5G (Basel III) reporting. This large project involved consolidating data from more than 15 heterogeneous sources (Oracle/Sybase/XML/text/CSV) and calculating and standardizing the data into XML for reporting.
  • Architected and implemented a dynamic data transformation engine based on business rules, which reduces production deployments and application downtime.
  • Designed and architected the IHC (Intermediate Holding Company) expansion to enable multiple legal entities to report to the US Fed. Consolidation and elimination logic was required to support entity reporting for FR 2052a, based on the latest guidance on reporting entities (Legal Entities and Intercompany Affiliate Counterparties) required by the regulatory filing.
  • Took up the additional role of Technical BA when the BA team was ramped down; played a major role in the LCR (Liquidity Coverage Ratio) project as a Tech BA and DWH Architect.
  • Designed and implemented FR Y-15 Schedule G and D reports and submitted them to the Regulatory team.
  • Collateral class enrichment for FR 2052a.
  • Automated feed regressions in case of failed file deliveries from upstream applications: the process picks up the latest available source files and modifies the loading logic appropriately to prevent the movement of maturity buckets, avoiding delays in 5G report submission to the Fed.
  • Designed and architected the application for the addition, maintenance, and approval of legal entities and their affiliates, a combination of UI and ETL. This facility enables the business to maintain the list of IHC legal entities reportable in FDM, which impacts 5G, MI, and LCR reporting, along with functionality for the business to review new IHC Legal Entities.
  • Database and ETL performance tuning for parallel processing of feeds, reducing data latency and providing quick turnaround time for the business.
  • Designed the fix for intercompany symmetry breaks via ETL. Per the FR 2052a Post-Submission Validation Checks (the Instruction) issued by the Federal Reserve, symmetry checks are mandated for all intercompany transactions based on the product aggregation mapping provided in the Instruction; performing these manually is a potential operational risk.
  • Designed and automated the blacklisting of securities based on Bloomberg data. This automated decision-making process browses through the security values of the past seven years, checks whether the variation in value falls beyond a threshold, and blacklists such securities.
  • Working on a Big Data framework to move data from the traditional RDBMS (Oracle) to HDFS.
  • Developed a Spark Scala program to load data from files into the staging (STG) layer and from STG to the managed layer, applying enrichment logic.
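The feed-regression fallback described above can be sketched in a few lines of shell: when the expected daily delivery fails, select the most recent file the upstream did deliver. This is a minimal illustration; the directory layout, file pattern, and function name are assumptions, not taken from the original environment.

```shell
#!/bin/sh
# Hypothetical sketch: return the newest file matching a feed pattern,
# so a failed daily delivery can fall back to the latest prior file.
pick_latest_feed() {
    dir="$1"      # inbound directory the upstream drops files into
    pattern="$2"  # glob for this feed, e.g. 'positions_*.dat'
    # newest matching file by modification time; empty if none exist
    ls -1t "$dir"/$pattern 2>/dev/null | head -n 1
}
```

An orchestration script would call `pick_latest_feed` and, if a prior file is found, adjust the load's business-date logic accordingly before kicking off the workflow.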

Confidential, New York

Senior Software Engineer.

RDBMS: Oracle

Tools: Informatica 8.6.1/9.1

Responsibilities:

  • Worked as an ETL Developer, involved in all stages of the life cycle: analysis, design, build, test, and code deployment.
  • Understood functional requirements and developed technical requirements and a solution blueprint for the module.
  • Created mappings, mapplets, sessions, and workflows for the proposed ETL solution in Informatica.
  • Worked with Mapping Analyst for Excel to generate simple mappings.
  • Scheduled jobs using AutoSys.
  • Prepared design documents (both high-level and low-level designs) for the ECM application.
  • Performed calculations according to client requirements.
  • Analyzed defects and provided root cause analysis for Confidential Capital Markets.
  • Analyzed Confidential Client Profitability (Sales) and Confidential Capital Markets data and calculated the profit generated.
  • Worked on production enhancements and bug fixes for the Confidential Capital Markets application.
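The AutoSys scheduling mentioned above is typically expressed as a JIL job definition. A minimal sketch of a daily command job gated on an upstream file-watcher might look like the following; all job, machine, path, and owner names are hypothetical.

```
/* Hypothetical JIL sketch: a daily command job that runs an ETL
   shell script after an upstream file-watcher job succeeds. */
insert_job: ecm_daily_load
job_type: c
command: /apps/ecm/bin/run_daily_load.sh
machine: etl_server_01
owner: etluser
start_times: "02:00"
condition: s(ecm_file_watcher)
std_out_file: /apps/ecm/logs/ecm_daily_load.out
std_err_file: /apps/ecm/logs/ecm_daily_load.err
```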

Confidential

Senior Software Engineer.

RDBMS: DB2

Tools: Informatica 8.6.1

Responsibilities:

  • Worked as an ETL Developer creating mappings, mapplets, sessions, and workflows for the proposed ETL solution in Informatica.
  • Understood functional requirements and developed technical requirements and a solution blueprint for the module.
  • Performed data profiling and data analysis to check whether the existing data quality fits the business requirements.
  • Involved in the creation of the project estimation plan.
  • Involved in the creation of a rough initial data model and analyzed the columns for source, target, and work tables.
  • Prepared Unit Test Plan (UTP) documents covering all test cases for the developed mappings.
  • Extensively used DB2 to extract and load data.
  • Developed technical design documents (both low-level and high-level ETL technical design documents) and mapping specifications (source-to-target mapping docs).
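As a flat-file analogue of the data profiling step above, a small shell/awk sketch can report the blank-value ratio per column of a pipe-delimited extract, one of the basic completeness checks profiling performs. The delimiter and function name are illustrative assumptions.

```shell
#!/bin/sh
# Hypothetical profiling sketch: for each column of a pipe-delimited
# file, print the fraction of rows where the column is blank.
profile_blanks() {
    awk -F'|' '{
        if (NF > ncols) ncols = NF
        for (i = 1; i <= NF; i++) if ($i == "") blank[i]++
        rows++
    } END {
        for (i = 1; i <= ncols; i++)
            printf "col%d %.2f\n", i, (blank[i] + 0) / rows
    }' "$1"
}
```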

Confidential

Software Engineer

Technologies Used: Informatica 8.6.1

Responsibilities:

  • Involved in the analysis, design, coding, and testing of ETL mappings.
  • Created LLDs, HLDs, and test case documents.
  • Created documents such as the code review checklist and migration checklist.
  • Involved in performance tuning of Informatica jobs.
  • Created and performed unit testing.
  • Involved in all existing releases and helped the team resolve critical issues.
  • Created reusable objects such as mapplets, reusable transformations, and worklets.
  • Involved in the design of the historical load and incremental load processes.
  • Developed the ETL deployment plan and ETL operations manual to coordinate and drive ETL activities in Test and Production environments.
  • Worked with the QA team to test the code.
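An incremental-load design of the kind mentioned above typically hinges on a persisted watermark: the extract pulls only rows changed since the last successful run, then advances the watermark. A minimal shell sketch of those two helpers (file path and timestamp format are assumptions):

```shell
#!/bin/sh
# Hypothetical incremental-load helpers: the extract query would select
# rows with an update timestamp greater than the stored watermark, and
# the watermark is then advanced to the max timestamp just processed.
get_watermark() {
    # default to the epoch on the very first (historical) load
    cat "$1" 2>/dev/null || echo "1970-01-01 00:00:00"
}
set_watermark() {
    echo "$2" > "$1"
}
```

Keeping the watermark update as the last step of a successful run means a failed run simply re-extracts the same window, making the load safely re-runnable.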

Confidential

Software Engineer.

Technologies Used: Informatica 8.6.1, Teradata, UNIX

Responsibilities:

  • Prepared ETL technical design documents per the given business requirements.
  • Developed mappings and workflows per the business logic, quality, and coding standards prescribed for the module.
  • Used Informatica to build complex mappings, extensively using transformations such as Source Qualifier, Aggregator, Lookup, Filter, Expression, Router, and Sorter.
  • Prepared Unit Test Plan (UTP) documents for all test cases for the developed mappings and performed unit testing.
  • Extensively used Teradata loader scripts for bulk load utilities such as MultiLoad (MLoad) and TPump.
  • Created reusable objects such as mapplets, transformations, and worklets.
  • Involved in fixing production issues of released phases.
  • Involved in migrating Informatica objects from one environment to another.
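A skeletal Teradata MultiLoad control script of the kind referenced above might look like the following. This is a sketch only; the database, table, column, and file names are hypothetical.

```
.LOGTABLE  etl_wrk.policy_ml_log;       /* restart log table */
.LOGON     tdprod/etluser,password;
.BEGIN IMPORT MLOAD TABLES stg.policy;  /* target staging table */
.LAYOUT    policy_layout;
.FIELD     policy_id   * VARCHAR(18);
.FIELD     premium_amt * VARCHAR(20);
.DML LABEL ins_policy;
INSERT INTO stg.policy (policy_id, premium_amt)
VALUES (:policy_id, :premium_amt);
.IMPORT INFILE policy_feed.dat
        LAYOUT policy_layout
        APPLY  ins_policy;
.END MLOAD;
.LOGOFF;
```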
