
ETL Developer/Informatica/SAP Data Services Resume


SUMMARY

  • IT professional with 13+ years of leadership experience in development, analysis, testing, and training.
  • Strong understanding of ETL design and development, PL/SQL stored procedures, data warehousing, star schemas, fact and dimension tables, Slowly Changing Dimensions, data transformation, data mapping, data cleansing, and data profiling.
  • Strong skills in Oracle data partitioning techniques, parallel execution, dynamic SQL.
  • Over 7 years of experience with Informatica PowerCenter 8.4, 9.0, 9.2.3, and 10.2, developing complex transformations, mappings, mapplets, and workflows.
  • Hands-on experience with shell scripting, job scheduling, AutoSys, and performance tuning.
  • Highly proficient in SQL (Oracle, SQL Server, Netezza, Greenplum, Teradata, DB2) and PL/SQL, developing procedures, user-defined functions, packages, synonyms, and complex analytical functions.
  • Strong understanding of data validation, metadata, data governance, master data management, data migration, data archival, data cleansing, data extraction, data transformation, and data loading.
  • Advanced skills in SQL, UNIX, PL/SQL, VBA, Perl, and SAS.
  • Good understanding of data normalization (first, second, and third normal forms), data marts, data warehouses, star schemas, facts and dimensions, object-oriented programming, source-to-target mapping, data quality, metadata, and data retention and archival.
  • Advanced skills in reporting applications such as Cognos, Business Objects, SAS, and OBIEE.
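The Slowly Changing Dimension work above follows the standard Type 2 (expire-then-insert) pattern; a minimal sketch of that pattern, illustrated with Python's sqlite3 (the dim_customer/stg_customer tables and columns are hypothetical, not taken from any engagement below):

```python
import sqlite3

# Minimal Type 2 slowly changing dimension sketch. The dim_customer /
# stg_customer tables are hypothetical, chosen only to illustrate the
# expire-then-insert pattern.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    eff_date    TEXT,
    end_date    TEXT,       -- NULL = still current
    is_current  INTEGER
);
CREATE TABLE stg_customer (customer_id INTEGER, city TEXT);
INSERT INTO dim_customer VALUES (1, 'Austin', '2020-01-01', NULL, 1);
INSERT INTO stg_customer VALUES (1, 'Dallas');   -- attribute changed
""")

load_date = "2021-06-01"
# Step 1: expire the current row when a tracked attribute changed.
cur.execute("""
    UPDATE dim_customer
       SET end_date = ?, is_current = 0
     WHERE is_current = 1
       AND customer_id IN (
           SELECT s.customer_id
             FROM stg_customer s
             JOIN dim_customer d ON d.customer_id = s.customer_id
            WHERE d.is_current = 1 AND d.city <> s.city)
""", (load_date,))
# Step 2: insert the new version as the current row.
cur.execute("""
    INSERT INTO dim_customer
    SELECT s.customer_id, s.city, ?, NULL, 1
      FROM stg_customer s
      JOIN dim_customer d ON d.customer_id = s.customer_id
     WHERE d.is_current = 0 AND d.end_date = ?
""", (load_date, load_date))
conn.commit()
print(cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date"
).fetchall())   # → [('Austin', 0), ('Dallas', 1)]
```

The expired row keeps its history while the new row becomes current; in Informatica the same logic is typically split across a Lookup, an Update Strategy, and two target instances.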

TECHNICAL SKILLS

Front-end Tools: Cognos, Business Objects, SAS, MicroStrategy, OBIEE

ETL: Informatica PowerCenter, Informatica Metadata Manager, SSIS

Programming Languages: PL/SQL, Java, Perl, VBA, VB, UNIX shell scripting, Python

Databases: MS Access, Oracle 11g, Teradata, DB2, Sybase

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer/Informatica/SAP Data Services

Responsibilities:

  • Analyze ETL logic developed in SAP Data Services to capture the current business logic.
  • Perform gap analysis of the current and future states of the system design, data model, and business logic.
  • Develop the data mapping document (DMD) and data requirement document (DRD) for all impacted database objects and downstream consumers of data warehouse data.
  • Refactor existing ETL jobs/workflows/dataflows. Develop new ETL jobs/workflows/dataflows in light of data model/business logic changes.
  • Recommend data model changes in the data warehouse to account for upstream data model changes.
  • Work with the QA team in developing test cases and test scripts, and review the QA test strategy.

Confidential, McLean

Informatica Metadata Manager/Data Analyst

Responsibilities:

  • Load DataStage jobs from different project components into Informatica Metadata Manager as XML files on the AnalytixDS server.
  • Generate parameter files corresponding to the exported DataStage job XML.
  • Load the DataStage jobs into the AMM server and analyze metadata lineage.
  • Scan new database objects from different source systems for loading into Metadata Manager.
  • Analyze reasons for missing linkages in IMM. Work with AnalytixDS to re-establish lineage.

Confidential, Charlotte

Informatica Developer / IDQ

Responsibilities:

  • Develop Informatica mappings, sessions, and workflows to migrate data from multiple source systems into the AML data warehouse.
  • Develop Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.
  • Optimize Informatica mappings and workflows for faster data loads.
  • Develop PL/SQL stored procedures, packages, and functions to support complex Informatica mappings, transformations, and workflows.
  • Develop robust UNIX shell scripts to schedule Informatica workflows and manage parameter files.
  • Develop routines for extracting Informatica objects from multiple repositories and for migration into test and production environments.
  • Key participant in project planning, estimation, solution development, testing and delivery.
  • Provide analytical support to different stakeholders, analyze issues/enhancements.
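The IDQ parsing and standardization work above has a simple analogue outside the tool; a rough Python sketch of the Standardizer idea (the abbreviation map and rules below are made-up assumptions for illustration, not IDQ's actual reference data):

```python
import re

# Illustrative standardization rules, loosely analogous to an IDQ
# Standardizer transformation: normalize case and whitespace, strip
# punctuation, and expand street-type abbreviations. The abbreviation
# map is a made-up sample, not IDQ reference data.
STREET_TYPES = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "BLVD": "BOULEVARD"}

def standardize_address(raw: str) -> str:
    """Return an uppercased, whitespace-normalized address with
    street-type abbreviations expanded."""
    tokens = re.sub(r"[.,]", " ", raw.upper()).split()
    return " ".join(STREET_TYPES.get(token, token) for token in tokens)

print(standardize_address("123  main st."))      # → 123 MAIN STREET
print(standardize_address("9 oak ave, apt 2"))   # → 9 OAK AVENUE APT 2
```

In IDQ the equivalent rules live in reference tables so the business, not the code, owns the standardization dictionary.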

Confidential, San Francisco

Informatica Developer / PL/SQL Developer / IDQ

Responsibilities:

  • Gather data requirements for migrating data from heterogeneous source systems to landing area.
  • Develop PL/SQL stored procedures, PL/SQL user-defined functions, views, synonyms, packages to migrate data from source systems to data warehouse.
  • Work with the data modeler to create physical and logical diagrams.
  • Develop Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.
  • Develop shell scripts to automate Informatica workflow schedules and automate database tasks.
  • Develop high-performance queries, complex joins and advanced indexing techniques to optimize database operations.
  • Develop complex Informatica mappings, sessions, and workflows in line with business rules to migrate data from the source systems to the data warehouse.
  • Develop stored procedures, user-defined functions, packages, and other Oracle objects to migrate the data from source systems to the data warehouse.
  • Generate datasets by writing SQL queries that provide insights for the project stakeholders in designing the new application.
  • Develop complex built-in transformations such as Lookup, Joiner, XML, and Web Services.
  • Develop workflows to expedite large-volume data movements by implementing push-down optimization and partitioning in Informatica.
  • Develop UNIX shell scripts to schedule Informatica workflows.
  • Analyze large datasets using Python/Perl.
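The data profiling and large-dataset analysis mentioned above largely comes down to per-column counts; a minimal Python sketch (the sample rows are made up for illustration):

```python
# Minimal data-profiling sketch: per-column row count, null count, and
# distinct-value count, the basic checks behind a profiling pass.
# The sample rows below are made up for illustration.
def profile(rows):
    report = {}
    for col in (rows[0].keys() if rows else []):
        values = [row.get(col) for row in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

sample = [
    {"id": 1, "state": "CA"},
    {"id": 2, "state": None},
    {"id": 3, "state": "CA"},
]
print(profile(sample))
```

A null rate or distinct count outside the expected range is what typically flags a column for the cleansing rules described above.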

Confidential, Bridgewater

SQL Developer-cum-SSIS/Informatica Developer

Responsibilities:

  • Develop advanced database objects, stored procedures, user-defined functions to represent business logic in technical terms.
  • Develop high-performance queries, complex joins and advanced indexing techniques to optimize database operations.
  • Profile data, cleanse data, and apply business rules to standardize source data.
  • Develop Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.
  • Develop a new process to extract data from existing systems and add additional data to support Group Voluntary and Work Life Benefits (GVWB) broker marketing.
  • Create Informatica mappings, sessions, and workflows for the ETL process to load the data into the enterprise data warehouse (EDW).
  • Develop complex built-in transformations such as Lookup, Joiner, XML, and Web Services.
  • Develop workflows to expedite large-volume data movements by implementing push-down optimization and partitioning in Informatica.
  • Develop reusable transformations such as mapplets to simplify and expedite Informatica development.
  • Perform Salesforce.com data validation for the monthly reports.

Confidential, Los Angeles, July 2014 - July 2015

Informatica Developer

Responsibilities:

  • Work with the business SMEs and business analysts in developing database objects, stored procedures, user-defined functions, and packages to capture business logic.
  • Develop high-performance queries and database-optimized constructs to improve performance.
  • Identify performance bottlenecks and refactor database objects to improve response times.
  • Lead the design, development, testing, and documentation of all aspects of ETL for migrating the data from multiple source systems into the staging area.
  • Develop Informatica mappings and workflows to migrate data from source systems to the data warehouse.
  • Develop PL/SQL objects - procedures, functions, and packages - to migrate the data.
  • Key participant in project management meetings. Influence decisions on solution development by carefully evaluating proposals and proposing alternative solutions.
  • Aid in the development of Informatica mappings and workflows to implement the ETL solution for data movement to staging.
  • Build relationships with the senior management team, project partners, and other stakeholders through sustained high-quality deliveries, a team-player attitude, and a can-do attitude.

Confidential, Calabasas, CA

SQL Developer

Responsibilities:

  • Develop complex SQL queries, optimize existing SQL objects to improve application performance.
  • Work with the business to understand requirements and convert them into technical specifications.
  • Develop stored procedures, user-defined functions, indexes, views, and packages.
  • Analyze large volumes of data using UNIX/Perl and Python.
  • Develop SSRS reports modeled on legacy reports, with enhancements.
  • Work with the development team to communicate business requirements, evaluate and recommend technical solutions, and support testing.
  • Develop key documents such as high-level design documents, low-level design documents, accelerated change documents, workflow diagrams, use cases, and data flow diagrams.
  • Key participant in project planning, estimation, solution development, testing, and delivery.
  • Provide analytical support to different stakeholders; analyze issues/enhancements.
  • Evaluate technical solutions and influence the design, coding, testing, and delivery of solutions.
  • Work with the development team and subject matter experts in documenting workflow logic while migrating FPS from .NET 1.0 to .NET 4.0.
  • Develop a test plan, test strategy, and test scripts for the component integration testing phase.

Confidential, Irvine, CA

Informatica PowerCenter Developer

Responsibilities:

  • Develop complex T-SQL queries to validate that segmentation for cost centers has been accurately captured for first-lien, held-for-sale secondary mortgages.
  • Develop Informatica mappings, sessions, and workflows to migrate data from the source systems to the landing layer and the data warehouse.
  • Develop complex reusable transformations such as mapplets, and complex transformations such as Joiner, Aggregator, XML, and Web Services, to implement business logic.
  • Analyze large volumes of data using UNIX/Perl and Python programs.
  • Develop complex SQL queries to validate that the data archival process accurately picks up the paid, charge-off, and void customers.
  • Work collaboratively with the development team in analyzing the legacy application table structures/data elements to develop robust, scalable code for high-volume data migration.
  • Perform root-cause analysis of data variance between expected and actual results for the data migration/archival process.

Confidential, Calabasas, CA

PL/SQL Developer

Responsibilities:

  • Develop complex T-SQL queries to validate that segmentation for cost centers has been accurately captured for first-lien, held-for-sale secondary mortgages.
  • Develop Informatica mappings, sessions, and workflows to migrate data from the source systems to the landing layer and the data warehouse.
  • Develop complex reusable transformations such as mapplets, and complex transformations such as Joiner, Aggregator, XML, and Web Services, to implement business logic.
  • Analyze large volumes of data using UNIX/Perl and Python programs.
  • Perform data migration by loading data from several ODS systems into target systems using SSIS.
  • Follow up on bug fixes and execute regression testing after code changes.
  • Develop Program Change Requests (PCRs) based on High-Level Design (HLD) and Low-Level Design (LLD) documents and inputs from key project stakeholders.
  • Work collaboratively with the team lead in developing high-level design documents and associated project artifacts for the TDR-NCL and Clear Optimization projects.
  • Work collaboratively with the development team in clarifying Program Change Requests.
  • Respond to ad-hoc requests for mortgage-related data analysis from business and technical heads.
  • Gather requirements for validating data and develop SQL-based test scripts for data validation.
  • Test mortgage-related data (HFS and HFI loans) to verify that segmentation has been applied correctly and that journal entries are being booked correctly to the new cost centers.
  • Perform root-cause analysis of data variance between expected and actual results for the GLSR project.

Confidential, Richmond, VA

ETL Developer/Report Developer

Responsibilities:

  • Work closely with the AML business group to generate reports by consolidating data residing in different databases (Teradata and Oracle).
  • Generate reports in SAS/Business Objects by consolidating data residing in different databases (Teradata, SQL Server, and Oracle) to aid auditors in verifying compliance with regulations.
  • Consolidate data residing in multiple databases using SAS to generate business intelligence.
  • Migrate data from multiple source systems to the development/test environment using SSIS.
  • Work collaboratively with the application development team to support software/application testing in conformity with Confidential's Distributed Data Environment (DDE) standards.
  • Develop test cases/test scripts and other test artifacts per Confidential's DDE standards, based on the business requirement document and technical design document, to support integration testing.
  • Validate data movement from source-to-target; identify probable causes for data variance.
  • Identify software defects and perform regression testing to validate defect resolution.
  • Validate data transformation during data movement from source-to-target.
  • Prepare appropriate data for loading into test environment for application testing.
  • Identify performance variance from actual and expected results and escalate issues to development team for software bug fixes.
  • Perform regression testing and provide sign-off on test results for code movement to production.
  • Analyze business requirements and evaluate the software testing methodology for different components of CCB integration.
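The source-to-target validation and variance analysis described above can be sketched as a simple count-and-aggregate reconciliation; a minimal illustration using Python's sqlite3 (the src_orders/tgt_orders tables are hypothetical):

```python
import sqlite3

# Minimal source-to-target reconciliation sketch: compare row counts and
# a column aggregate to flag data variance. The src_orders / tgt_orders
# tables are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);  -- order 3 missing
""")

def reconcile(cur, src, tgt):
    """Return (row_count_delta, amount_sum_delta) between two tables."""
    src_n, src_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    tgt_n, tgt_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}").fetchone()
    return src_n - tgt_n, src_sum - tgt_sum

print(reconcile(cur, "src_orders", "tgt_orders"))  # → (1, 7.25)
```

A non-zero delta in either position is the trigger for the row-level root-cause analysis described in the bullets above.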

Confidential, McLean, VA

Informatica Developer

Responsibilities:

  • Developed requirement documents for source-to-target mapping of column fields.
  • Performed data profiling, data cleansing, and data validation using SAS.
  • Performed data validation and data variance analysis using SAS.
  • Developed test strategy, data preparation and data cleansing and data loading.
  • Developed test plan, test cases and test scripts for UAT testing.
  • Worked collaboratively with the SIT team in automating SIT test cases, analysis, and defect remediation.
  • Performed root-cause analysis of data variance in the data quality report generated by SAS.
  • Automated UAT test cases using VBA. Analyzed variance in the TDQ exception report in UAT.
  • Worked closely with the SAS development team and the ETL development team in fixing defects.
  • Performed manual testing of non-automated test scripts as well as regression testing.

Confidential, Boston, MA

Informatica Developer

Responsibilities:

  • Developed source-to-target documentation for mapping legacy data to the data mart layer.
  • Participated in data normalization exercises. Designed fact, dimension, and reject tables.
  • Analyzed the data model and suggested changes to support the presentation layer in Cognos.
  • Performed root-cause analysis of data variance between the source system and the data mart.
  • Developed data quality rules based on inputs from business leaders.
  • Recommended strategies for improving data quality to avoid bad data in the data mart.
  • Prepared test data for application testing and loaded it into the Oracle database.
  • Developed testing strategy for new business intelligence application.
  • Developed test cases and test scripts using SQL for back-end testing.
  • Prepared test cases using UNIX shell scripting for regression testing.
  • Developed automated test cases using VB Scripts for parameterized testing in QTP.
  • Filed defects in Rational ClearQuest and assigned them severity levels.

Confidential, Herndon, VA

Business Objects Report Developer

Responsibilities:

  • Developed functional specification document for business reports on Confidential analytics.
  • Interviewed SMEs to develop the validation and transformation rules for the ETL process.
  • Prepared the high-level test strategy, test plan, and test scripts for SIT/UAT testing.
  • Developed test cases for regression testing in UNIX using shell scripting.
  • Prepared test data for application testing and loaded it into the Oracle database.
  • Analyzed mortgage-backed securities to establish the correct basis for interest-only and principal-only cash flows in line with US GAAP FAS 91 guidelines.
  • Generated business reports using Business Objects, Hyperion, and Cognos.
  • Analyzed MRB to establish the correct basis for cash flow amortization per FAS 140.
  • Analyzed Confidential's Real Estate Mortgage Investment Conduit portfolio to establish the correct basis for amortization of cash flows in line with FAS 144 US GAAP guidelines.
  • Mapped the cash flow data from the Restatement Financial Data Warehouse (RFDW) into the Securities Cost Basis Sub-Ledger (SCBSL) and the general ledger.
  • Performed application testing in SIT phase, UAT and pre-production environment.
  • Performed regression testing in UNIX after bug fixes and after every new code release.
