Sr ETL Informatica Developer Resume


Latham, NY

SUMMARY

  • 9+ years of IT Experience in analysis, design, development, implementation, administration and troubleshooting of Data warehouse applications.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts using Multidimensional and Dimensional modeling (Star and Snowflake schema) Concepts.
  • Significant Multi - dimensional and Relational data modeling experience, Data Flow Diagrams, Process Models, ER diagrams with modeling tools like ERWIN & VISIO.
  • Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources like DB2 UDB, Oracle, flat files, XML files, Teradata, SQL Server, Hive and Impala into Azure targets.
  • Demonstrated expertise in using the ETL tools Informatica PowerCenter 10.2/9.x/8.x/7.x, Informatica Power Exchange 10.2/9.x, Informatica Big Data Edition 9.x and Informatica Cloud (IICS/ICS) to develop and administer data warehouse loads as per client requirements.
  • Strong experience in implementing CDC using Informatica Power Exchange 10.2/9.x, creating registration groups and extraction groups for the CDC implementation.
  • Good understanding of big data concepts like Hadoop, Spark and data streaming.
  • Experience working with healthcare standard files like NCPDP, HL-7, 835 and 837.
  • Extensive experience in developing mappings using various transformations like Source Qualifier, Expression, Lookup, Aggregator, Router, Rank, Filter and Sequence Generator transformations and various reusable components like Mapplets and reusable transformations.
  • Extensive experience in ETL testing across different environments, including unit testing, smoke testing, functional testing, user acceptance testing and SIT (system integration testing).
  • Extensive experience in maintaining Informatica connections in all environments, including relational, application and FTP connections.
  • Experience in Informatica version upgrades and maintenance. Maintaining Informatica Server with repository health checks, NAS file system, Grid, Backup and Recovery options.
  • Experience with data lake implementations; involved in development using Informatica to load data into Hive and Impala systems.
  • Experience with Informatica deployments using deployment groups and third-party deployment tools.
  • Extensive knowledge in developing Teradata FastExport, FastLoad, MultiLoad and BTEQ scripts. Coded complex scripts and fine-tuned queries to enhance performance.
  • Profound knowledge about the architecture of the Teradata database.
  • Experience in writing PL/SQL, T-SQL procedures for processing business logic in the database. Tuning of SQL queries for better performance.
  • Extensive experience in implementing data cleanup procedures, transformation scripts, triggers and stored procedures, and in executing test plans for loading data successfully into the targets.
  • Good experience in designing and developing audit, error-identification and reconciliation processes to ensure the data quality of the data warehouse.
  • Excellent knowledge in identifying performance bottlenecks and tuning the Informatica Load for better performance and efficiency.
  • Experience in UNIX shell scripting, CRON, FTP and file management in various UNIX environments.
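As an illustration of the UNIX shell scripting and file-management work above, a nightly cleanup step might look like the following minimal sketch (the directory paths and the `.dat` suffix are hypothetical, not taken from the projects):

```shell
#!/bin/sh
# Minimal sketch of an ETL file-archival step (hypothetical paths):
# move processed flat files out of the landing area into a dated
# archive directory so the next load starts from a clean directory.

LANDING_DIR="${1:-/tmp/etl_landing}"    # where source flat files arrive
ARCHIVE_DIR="${2:-/tmp/etl_archive}"    # where processed files are kept

TODAY=$(date +%Y%m%d)
mkdir -p "$ARCHIVE_DIR/$TODAY"

# Move every flat file; a real job might first check for a ".done"
# marker written by the Informatica session before archiving.
for f in "$LANDING_DIR"/*.dat; do
    [ -e "$f" ] || continue             # glob matched nothing: skip
    mv "$f" "$ARCHIVE_DIR/$TODAY/"
done
```

Scheduled from CRON (for example `0 2 * * * /opt/etl/archive_files.sh`, a hypothetical entry), a step like this keeps the landing area clean between daily loads.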

TECHNICAL SKILLS

  • Oracle 9i/10g/11g, SQL Server 2012/2008/2005, Teradata v16/v13/v2R5, DB2 UDB 7.2/8.0
  • SQL Assistant, MySQL 5.0/4.1, SQL editors (SQL Navigator, Toad)
  • Informatica (PowerCenter 10.2/9.6/9.1/8.6/8.5/8.1.2/8.1.1), Informatica Data Quality (Developer tool) 9.x
  • Data Transformation DT(Developer), Informatica MDM, Informatica Big Data Edition 9.x, Informatica Power Exchange 10.2/9.x
  • Unix Shell Scripting, Batch Scripting, PL/SQL, T-SQL, PERL
  • Data Modeling - Logical, Physical Dimensional Modeling - Star / Snowflake
  • Autosys, Control-M, Informatica Scheduler, Tivoli
  • MicroStrategy, Cognos, Business Objects XI, MS SQL Server Reporting Services, Crystal Reports 10
  • Hive, Impala
  • Service Now

PROFESSIONAL EXPERIENCE

Confidential, Latham, NY

Sr ETL Informatica Developer

Responsibilities:

  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Created a shell scripting framework to fine-tune the ETL flow of the Informatica workflows.
  • Used Informatica file watch events to poll the FTP sites for the external mainframe files.
  • Created Informatica mappings, sessions and workflows, including tasks like Command, Event Wait, Event Raise, Timer and Assignment, based on business requirements.
  • Carried out extraction, transformation and loading of data from different sources like flat files and SQL Server.
  • Extracted data from source systems, applied complex transformations and then loaded the data into the target tables.
  • Designed Complex mapping flow using Source qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy transformations and Mapplets to load data into the target involving slowly changing dimensions.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Effectively worked in Informatica version-based environment and used deployment groups to migrate the objects.
  • Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Applied business rules using complex SQL and stored procedures.
  • Worked on ETL strategy to store Data validation rules, Error handling methods to handle both expected and non-expected errors and documented it carefully.
  • Created technical specification documents based on high-level requirement documents.
  • Reworked the flat file extracts whenever discrepancies were found.
  • Worked with MDM team and responsible for the Data Delivery to MDM process.
  • Responsible for all the data quality checks rules build for the project.
  • Investigated any data quality failures and identified the bottlenecks.
  • Worked on DT to create parsers and serializers to handle the XML and non-XML files for the business.
  • Worked on multiple ad hoc requests from the business and accommodated them alongside the daily routine.
  • Acted as a Data Analyst for any Data related issues reported.
  • Moved the code between the development and system testing environments.
  • Performed audit and reconciliation of the data during SIT.
  • Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Identified problems in existing production data and developed one time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems with the database.
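The file-watch step above can be sketched in shell as a small polling helper (the trigger-file path, retry count and interval below are illustrative defaults, not values from the project):

```shell
#!/bin/sh
# Sketch of a file-watch (poll) step, similar in spirit to an
# Informatica file watch event: wait for an expected trigger file to
# arrive so the downstream workflow can be released.

wait_for_file() {
    watch_file="$1"         # file we are waiting for
    max_tries="${2:-5}"     # how many polls before giving up
    sleep_secs="${3:-1}"    # seconds between polls

    i=0
    while [ "$i" -lt "$max_tries" ]; do
        if [ -f "$watch_file" ]; then
            return 0        # file arrived; proceed with the load
        fi
        i=$((i + 1))
        sleep "$sleep_secs"
    done
    return 1                # timed out; alert and stop the flow
}
```

A scheduler step would call something like `wait_for_file /data/in/mainframe_extract.trg 60 30` (hypothetical values) before releasing the dependent Informatica session.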

Environment: Informatica PowerCenter 10.2/10.1, Informatica MDM, Data Transformation (Developer), IDQ, Oracle 11g/10g, DB2, SQL Server 2012, Tableau, Toad, UNIX, PowerShell.

Confidential

Sr ETL Informatica Developer

Responsibilities:

  • Requirement gathering, business analysis and documentation of functional, technical and integration documents, as well as low-level and high-level design documents.
  • Involved in Requirement analysis in support of Data Warehousing efforts and Data Lake Implementation.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router, SQL, Union and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Worked on the migration from PowerCenter 9.6.1 to PowerCenter 10.2.
  • Worked on moving data to the data lake using Informatica BDE and big data concepts.
  • Worked with source databases like Oracle, SQL Server, Teradata and Flat Files.
  • Extensively worked with the Teradata utilities BTEQ, FastLoad, MultiLoad and TPT to load data into the warehouse.
  • Worked independently on data migration tasks: backing up and restoring databases, comparing data between databases, comparing schemas and executing migration scripts.
  • Involved in Unit testing, Smoke test, Functional testing, user acceptance testing and system integration testing.
  • Created test plan, test data and executed test plan for business approvals.
  • Created complex mappings using Unconnected and Connected Lookup Transformations using different caches.
  • Involved in migration projects to migrate data warehouses from Oracle/DB2 to Teradata.
  • Experience working with member, pharmacy, patient, provider, encounter and claim data.
  • Experience handling company-standard EDI files like 835, 837, HL-7 and NCPDP files.
  • Experience building DT parsers and serializers to load the EDI files into the database.
  • Created a parser framework to parse X12 files into XML for the target instance, and a serializer to load non-XML data and generate the XML for the outbound process.
  • Experience parsing structured and unstructured files using Informatica PowerCenter.
  • Implemented Slowly changing dimension Type 1 and Type 2 for change data capture.
  • Used Informatica Pushdown Optimization (PDO) to push transformation processing from the PowerCenter engine into the relational database to improve performance.
  • Thorough understanding of 835 health care claim payment and remittance files, and of HL-7 practitioner-specialization unstructured file segments and repeating groups.
  • Experienced in Teradata Parallel Transporter (TPT). Used full PDO on Teradata and worked with different Teradata load operators.
  • Worked with various lookup caches such as dynamic cache, static cache, persistent cache, re-cache from database and shared cache.
  • Worked on loading data into Hive and Impala system for data lake implementation.
  • Designed solution to process file handling/movement, file archival, file validation, file processing and file error notification steps using Informatica.
  • Worked extensively with update strategy transformation for implementing inserts and updates.
  • As per the business requirement, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables.
  • Implemented Informatica push down optimization for utilizing the database resources for better performance.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre-session commands.
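The auditing-and-balancing rule above (every record read is either written to the target or captured in the maintenance tables) can be sketched as a count reconciliation. The function name and counts are illustrative; in practice the counts would come from session logs or audit tables:

```shell
#!/bin/sh
# Sketch of an audit/balance check: a load is balanced only when
# records read == records written to target + records rejected into
# the maintenance (error) tables.

audit_balance() {
    read_count="$1"
    written_count="$2"
    rejected_count="$3"

    if [ "$read_count" -eq "$((written_count + rejected_count))" ]; then
        echo "BALANCED: read=$read_count written=$written_count rejected=$rejected_count"
        return 0
    else
        echo "OUT OF BALANCE: read=$read_count written=$written_count rejected=$rejected_count" >&2
        return 1
    fi
}
```

An out-of-balance result would fail the workflow before the load is marked complete, which is what makes the reconciliation useful as a data-quality gate.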

Environment: Informatica PowerCenter 10.2/9.6.1/9.x, Informatica BDE 9.x, Oracle 11g/10g, DB2, SQL Server 2012, Hive, Impala, MicroStrategy, Teradata v16, Tableau, Toad, UNIX.
