Informatica / Cloud Application Developer Resume


Tysons, VA

SUMMARY:

  • Over 10 years of IT experience with a strong background in designing and developing Business Intelligence solutions and decision support systems using Informatica PowerCenter, DVO, PMPC, IDQ, MDM, IICS, and Power Exchange.
  • Worked with various clients in the healthcare, higher education, food & nutrition, and financial sectors.
  • Experience with advanced ETL techniques including Slowly Changing Dimensions, reusability, data validation and profiling, Change Data Capture (CDC), and error processing.
  • Experienced in creating entity-relationship and dimensional data models using the Kimball methodology, i.e., star schema and snowflake schema.
  • Proficient in Requirement Analysis, Technical Design Specification creation, Development, Testing, Star Schema Design and implementations.
  • Experienced in creating ETL specifications per customer requirements, writing unit test cases for ETL code, and performing peer code reviews.
  • Highly proficient in using Informatica PowerCenter to move data from multiple heterogeneous sources such as Oracle, SQL Server, Sybase, SAP, and flat files into BI-specific target areas such as staging, warehouse, and data marts.
  • Certified Confidential Analyst (Specialist and Master) and Confidential Administrator (Specialist).
  • Heavily used the Informatica PowerExchange adapter for SAP and the PowerExchange adapter for the ADABAS legacy system.
  • Coordinated with the SAP team in using LSMW, an SAP R/3-based tool, to transfer data from legacy systems to the SAP system.
  • Good experience in UNIX and Perl scripting for automating ETL processes and Informatica pre- and post-session operations.
  • Expertise in developing SQL and PL/SQL code, including procedures, functions, and packages, to implement business logic (see the PL/SQL sketch after this list).
  • Well versed in SQL development tools such as TOAD, DbVisualizer, Oracle SQL Developer, and Oracle SQL Data Modeler.
  • Well versed in common SDLC methodologies (Waterfall, Agile).
  • Knowledge of ETL for consuming data from big data sources and loading a Hive data warehouse.
  • Highly skilled in SQL query tuning and Informatica performance tuning.
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop.
  • Ability to quickly master new concepts and applications.
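
To illustrate the PL/SQL work referenced above, here is a minimal sketch of a stored procedure that applies a business rule while moving staged rows into a warehouse table; the table and column names (stg_orders, dw_orders) are hypothetical, not objects from any of the projects below:

    -- Hypothetical example only: move validated staging rows into the
    -- warehouse, applying a currency-normalization business rule.
    CREATE OR REPLACE PROCEDURE load_valid_orders IS
    BEGIN
      INSERT INTO dw_orders (order_id, order_total, load_date)
      SELECT order_id,
             order_amt * NVL(exchange_rate, 1),  -- default to base currency
             SYSDATE
      FROM   stg_orders
      WHERE  status = 'VALID';
      COMMIT;
    END load_valid_orders;
    /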

TECHNICAL SKILLS:

ETL Tools: Informatica 10.x/9.x/8.x, Informatica Power Exchange 8.x, Informatica (DVO, PMPC, MDM, IDQ, IDE), DataStage, SSIS.

Methodologies: E-R Modeling, Star schema, Snowflake schema.

Programming Languages: SQL, PL/SQL.

Databases: Oracle, MS SQL Server, Teradata, DB2.

Reporting Tools: MicroStrategy, Business Objects, Cognos.

Data Modeling: Erwin 4.2.2, Oracle SQL Data Modeler.

Operating Systems: Windows XP/NT/2000, UNIX.

Database utility tools: Toad, Oracle SQL Developer, SQL*Plus, LSMW, DbVisualizer.

Scheduler: Informatica Scheduler, Autosys, Tidal, Control-M.

Big Data/Hadoop Ecosystem: HDFS, Hive, Sqoop, Hue.

WORK EXPERIENCE:

Confidential, Tysons, VA

Informatica / Cloud Application Developer

Responsibilities:

  • Extensively used Informatica Cloud and PowerCenter Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Developed complex ETL mappings making use of Source Qualifier, Joiner, Update Strategy, connected and unconnected Lookup, Expression, Router, Filter, Normalizer, Aggregator, SQL, Transaction Control, and Sequence Generator transformations, as well as Web Services, Java, and HTTP transformations.
  • Upgraded Informatica PowerCenter from 9.6 to 10.4.
  • Used Informatica Data Validation Option (DVO) for upgrade and migration testing.
  • Installed and configured Informatica Proactive Monitoring for PowerCenter.
  • Used the Informatica Salesforce transformation to import Salesforce objects into the Designer.
  • Created Informatica mappings to load currency exchange rates from OANDA into JDE.
  • Implemented a new-generation data model for the sales and marketing data marts.
  • Extensively used Informatica Cloud data replication tasks to load Salesforce objects into staging tables.
  • Developed reports, dossiers, and dashboards using Confidential Developer and Confidential Web.
  • Worked on Active Directory and Usher Pro integration to populate employee data in Usher Pro.
  • Used Informatica Cloud (IICS) to load Salesforce objects into AWS S3 buckets.
  • Created Gulp jobs to load data from AWS S3 buckets into the Redshift data lake (see the COPY sketch after this list).
  • Worked on helpdesk tickets, resolving them within the SLA, and maintained documentation to support cross-training within the team.
  • Implemented Confidential Cloud features through the entire feature development lifecycle, including input on design specifications, implementation of functionality, testing, and analyzing and optimizing the implementation to support expanding Confidential's platform capabilities.
  • Designed, implemented, and optimized Confidential platform deployment mechanisms on infrastructures such as physical hardware, AWS Cloud, and Azure Cloud systems.
  • Troubleshot and triaged issues related to software and hardware components and the Confidential product suite implementation.
  • Certified Confidential Analyst (Specialist and Master) and Confidential Administrator (Specialist).
  • Participated in peer code reviews, knowledge sharing, Scrum meetings and mentoring.
  • Provided expert services including stress testing, sizing, configuration audits, on-site implementation of best practices, performance tuning, ETL routines, and data modeling.
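
A minimal sketch of the S3-to-Redshift load pattern referenced above, using Redshift's COPY command; the bucket, table, and IAM role names are placeholders rather than actual project objects:

    -- Hypothetical COPY: bucket, table, and IAM role are placeholders.
    COPY stg_sf_account
    FROM 's3://example-bucket/salesforce/account/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    IGNOREHEADER 1;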

Environment: Informatica Cloud, Power Center 10.4/10.2/9.6, DVO, PMPC, SQL Server 2016, Amazon Redshift, SQL, Confidential Web, Confidential Developer, GitHub, AWS S3, Jenkins, Flat Files (fixed and delimited), XML Files.

Confidential, Woonsocket, RI

Senior ETL/Application Developer

Responsibilities:

  • Technology Lead, Onsite Coordinator
  • Participated in all phases of the Development Life Cycle
  • Worked on all the Unix/Informatica setups required for Informatica upgrade.
  • Involved in requirements gathering from data stewards and in data model design for the MDM hub configuration for applications.
  • Created the landing, staging, and base object tables in the MDM Hub Console based on the data model design.
  • Developed BDM mappings to perform the initial data loads into the MDM landing area based on the business, data quality, and cleansing rules.
  • Responsible for Requirements gathering, Technical System Design, Development, Testing, Code Review, Code migration, UAT, Job scheduling, Offshore Coordination
  • Developed complex inbound/outbound mappings and reject strategies based on the business rules to perform initial data loads (IDL) and incremental loads using IDQ.
  • Developed re-usable components (Address Validation, Parsing, Standardization, etc.) based on the business requirements using IDQ.
  • Actively participated in understanding business requirements and in analyzing and designing the data migration/integration process with business analysts.
  • Interacted with the data architecture group, PM, and project team to finalize the TSD design strategies; created various diagrams as part of the project deliverables and the physical database design.
  • Used Informatica PowerCenter and Unix scripts through the Control-M tool to extract, transform, and load data.
  • Responsible for code migration, code review, and test plans, test scenarios, and test cases as part of unit/integration testing.
  • Automated the production jobs using Informatica workflows and shell scripts through Autosys.
  • Designed and developed high-performance ETL design plans and staging and target table designs to load large volumes of data (flat file to table, table to table, table to flat file) utilizing Informatica PowerCenter and parameter files.
  • Performed performance tuning on targets, sources, mappings, and sessions using techniques such as reusable transformations, reusable caches, and partitioning.
  • Developed test plans, test scenarios, and test cases as part of unit/integration testing.
  • Responsible for setting up Production Jobs and created Production Operational Document
  • Led a team of 2 developers onsite and 6 developers offshore.

Environment: Informatica PowerCenter, MDM, IDQ, Unix, Teradata, Autosys, TOAD, MS Office

Confidential, Smithfield, RI

Senior ETL Developer

Responsibilities:

  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Developed reusable modules for Hive, Sqoop, and HDFS (Hadoop Distributed File System) file operations.
  • Developed HQL scripts per ETL specs to move data from the source layer to the INT layer.
  • Developed Sqoop scripts to import and export data between Oracle and HDFS.
  • Used the Quest Data Connector for Oracle and Hadoop (OraOop) to improve the performance of data movement between Oracle and Hadoop.
  • Utilized Agile Scrum methodology to help lead, manage, and organize a team of 5 developers, with regular code review sessions.
  • Designed and developed Mapplets, shortcuts for faster development, standardization and reusability purposes.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables, Session Parameters and Parameter Files.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent sessions and scheduling them to run at specified times.
  • Used pre-session and post-session scripts to drop and recreate indexes before and after loading data into target tables to optimize performance (see the sketch after this list).
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Very familiar with the Informatica Debugger, which I used to debug complicated expressions for bug fixes.
  • Wrote Hive queries for data analysis to meet business requirements.
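
A sketch of the pre-/post-session index pattern mentioned above: the index is dropped before the bulk load so the database does not maintain it row by row, then rebuilt once the load completes. The index and table names are illustrative only:

    -- Pre-session SQL (hypothetical names): drop the index before the bulk load.
    DROP INDEX idx_sales_fact_cust;

    -- ... Informatica session loads sales_fact ...

    -- Post-session SQL: rebuild the index after the load completes.
    CREATE INDEX idx_sales_fact_cust ON sales_fact (customer_id);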

Environment: Informatica Power Center 9.1.0, DB2 AS/400, Oracle 10g, SQL Server, SQL, Flat Files (fixed and delimited), XML Files, TOAD, Control-M, HDFS, Hue, Sqoop, Hive, and Linux.

Confidential, Williamsville, NY

Senior ETL/Application Developer

Responsibilities:

  • Worked on building the MDM-EDW warehouse by creating numerous Informatica extracts and SQL feeds for Kalido from the legacy source systems, loading data to the staging/landing area and then from staging to the warehouse (MDM-EDW).
  • Worked on data conversion projects to move data from the old source system to the new source system (Health Rules) by creating Informatica programs, database scripts, and UNIX scripts.
  • Worked on the NIA Oncology project to ensure that members receive the most appropriate radiation therapy treatment, consistent with our medical policy, evidence-based clinical guidelines, and standards of care followed for treatment.
  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical design documents.
  • Designed and developed numerous Informatica mappings extracting data from the data warehouse and source systems for member enrollment, medical and pharmacy claims, EyeMed claims, lab results, pseudo claims, provider, supplier, product, group, member premium, etc.
  • Wrote data to flat files, EDI formats (834/837), and XML files that are sent to several third-party vendors.
  • Also loaded data into staging tables for various business purposes.
  • Worked on a decommission project in which legacy jobs were decommissioned into the MDM environment, as well as on MDM Releases 5, 6, and 7 and production fixes.
  • Extensively used Informatica Client tools.
  • Extracted data from various source systems like DB2, Oracle and flat files and loaded to a relational data warehouse and flat files.
  • Developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected and unconnected Lookup, Expression, Router, Filter, Normalizer, Aggregator, SQL, Transaction Control and Sequence Generator transformations.
  • Created and scheduled job groups and jobs using the Tidal scheduling tool.
  • Used pushdown optimization (source, target, full), thread statistics, shared caches, performance statistics files, and reject files to improve performance in mappings.
  • Debugged and implemented the best practices in mappings, sessions, and workflows for data extraction and loading from source to target databases.
  • Extensively involved in performance tuning of the Informatica mappings/sessions by increasing cache sizes and overriding the existing SQL (see the override sketch after this list).
  • Exposure to Electronic Data Interchange (EDI) for Member Enrollment, Eligibility and Claims.
  • Created and adapted existing UNIX Shell scripts for automation of different processes.
  • Created Catalogs/Categories in Kalido and loaded the data into categories through file and SQL feeds.
  • Worked with SQL tools such as TOAD for Oracle and Surveyor for AS/400 to run SQL queries and validate the data.
  • Performed unit testing for ETL mappings and created test case documents.
  • Prepared application workbooks providing guidelines for troubleshooting errors that occur at run time, and instructions for the production support team on how to restart the loads.
  • Prepared an ETL mapping document for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
  • Actively provided production support for early-life failures.
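
A sketch of the kind of SQL override used in the tuning work above: the default source-qualifier SELECT is replaced with a query that pushes the join and the incremental filter down to the database instead of performing them inside the mapping. All table and column names are hypothetical:

    -- Hypothetical source-qualifier override: join and filter in the database
    -- rather than pulling both full tables into Informatica.
    SELECT m.member_id,
           m.group_id,
           c.claim_id,
           c.paid_amt
    FROM   member m
    JOIN   claim  c ON c.member_id = m.member_id
    WHERE  c.service_date >= TRUNC(SYSDATE) - 30  -- incremental window only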

Environment: Informatica Power Center 9.1.0/8.1.1, DB2 AS/400, Oracle 10g, SQL Server, SQL, Flat Files (fixed and delimited), XML Files, EDI, SVN, Reflection, Unix, TOAD, Surveyor, Power, Tidal, Altiris, Kalido, AIX, Linux.

Confidential

Senior ETL Developer

Responsibilities:

  • Worked with functional experts from various departments to develop the ADABAS specifications for the different files that were to be transported to Oracle.
  • Developed mappings to load Mainframe ADABAS data to Oracle stage schemas using Power Center and Power Exchange.
  • Created and used Power Exchange mappings with the help of ADABAS developers who understood the ADABAS data structure.
  • Worked to create the Modernization, Consolidation, Conversion and Replenishment specifications which were based on the Functional specs developed by the Functional team.
  • Used Informatica IDE and IDQ to profile historical transactional data, assess its data quality, and cleanse it.
  • Used address and phone number scrubbing transformations from IDQ to clean data, guided by the profiling analysis from IDE.
  • Created Conversion files for different functional areas in SAP using cleansed ADABAS data which the SAP engine would use to become fully operational.
  • Read data from Z-tables, SAP hierarchies using Informatica plug-ins to SAP with ABAP code generation methods using both File staging and Streaming modes.
  • Used IDOC extraction methods in real time mode using "IDoc Integration Using ALE".
  • Created SAP/ALE IDOC INTERPRETER transformations to interpret SAP IDOC segment data.
  • Made RFC calls from Informatica using RFC/BAPI transformation.
  • Implemented SCD Type 2 dimensions with an effective date ranging technique in the project (see the sketch after this list).
  • Used Source Qualifier lookup techniques to look up application data structures from ADABAS and SAP that could not be accessed using conventional Lookup transformations.
  • Extensively used the Informatica team-based option (versioning) to systematically control code versions across the different stages of development in the development environment.
  • Used the SQL transformation in query and script modes with both dynamic and static connections on the Oracle database to dynamically connect to different databases and execute DML statements, and used query mode to execute static and dynamic queries.
  • Used Normalizer transformation to normalize and expand repeating (redefines clause) PE and ME groups in ADABAS data structures.
  • Used aggregator transformation as part of an Informatica mapping technique to flatten and load repeating (redefines clause) PE and ME groups to ADABAS data structures.
  • Used pre-session and post-session variable assignment in Informatica to simulate multithreading for load-balancing scenarios and gain performance.
  • Used pre-session and post-session variable assignment to transport Counters and values across sessions when required.
  • Used techniques like constraint-based loading to load tables in the same active source group; adjusted session-level properties such as "Treat Source Rows As" and created logical keys on targets to implement constraint-based loading.
  • Modified various target properties, such as the update override for relational targets and the On Commit property for XML targets, to get the best performance from the mappings.
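
A sketch of the effective-date-ranging SCD Type 2 pattern mentioned above, expressed as plain SQL rather than the actual Informatica mapping logic; dim_customer, stg_customer, and their columns are hypothetical:

    -- Hypothetical SCD Type 2: close out the current row, then insert the new version.
    -- eff_start_dt/eff_end_dt form the effective-date range; 9999-12-31 marks "current".
    UPDATE dim_customer d
    SET    d.eff_end_dt = TRUNC(SYSDATE) - 1
    WHERE  d.eff_end_dt = DATE '9999-12-31'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);  -- tracked attribute changed

    INSERT INTO dim_customer (customer_id, address, eff_start_dt, eff_end_dt)
    SELECT s.customer_id, s.address, TRUNC(SYSDATE), DATE '9999-12-31'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.eff_end_dt = DATE '9999-12-31');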

Environment: Informatica Power Center 9.0.1/8.6.1, Informatica Power Exchange 9.0.1, IDE (Informatica Data Explorer), IDQ (Informatica Developer, data quality), PL/SQL, Toad, UNIX.

Confidential, Memphis, TN

Senior Informatica Developer

Responsibilities:

  • Created ETL mappings using Informatica Power Center to move Data from multiple sources like Flat files, SQL Server, DB2 into a common target area of Staging.
  • Worked on conversions of Legacy to SAP through LSMW conversion programs using Informatica.
  • Created and deployed ABAP code on SAP systems to execute mappings which pull data from SAP sources and also load data into SAP Z tables.
  • Performed SAP transports, with the help of Basis personnel, for the ABAP code used in Informatica mappings.
  • Extensively used XML sources and the XML Parser transformation to interpret XML files from outside vendors and load them into the staging area, from where the data went to SAP.
  • Read data from XML sources with various Parent-Child groups with the help of sequence counters from Informatica which come as part of keys from the source.
  • Normalized and De-Normalized XML group data as per requirements to load into target.
  • Worked with various XML custom properties with the help of Informatica support to solve many issues.
  • Used Indirect File lists to load multiple files from various locations in one session.
  • Performed SQL query optimization using hints, indexes, and explain plans (see the sketch after this list).
  • Used the stored procedure transformation to call stored procedures in various stages of the mappings depending on the requirement.
  • Implemented SCD methodology, including Type 1 and Type 2 changes, to keep track of historical data.
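
A small example of the Oracle tuning workflow mentioned above: generate the execution plan for a hinted query, then inspect it with DBMS_XPLAN. The table, index, and bind variable names are illustrative only:

    -- Hypothetical tuning check: hint an index range scan and review the plan.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(c idx_claim_member) */
           c.claim_id, c.paid_amt
    FROM   claim c
    WHERE  c.member_id = :member_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);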

Environment: Informatica Power Center 8.6.1, DB2, Informatica Power Exchange 8.6.1, PL/SQL, Squirrel, UNIX.
