
Sr. Informatica ETL Developer Resume


NYC

SUMMARY

  • 7+ years of IT experience with expertise in the complete Software Development Life Cycle (SDLC), including business requirements gathering, system analysis, design, development and implementation of data warehouses.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts and Decision Support Systems (DSS).
  • Extensively used Informatica Client tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Sound knowledge of Informatica PowerCenter 10.1/9.6/9.x/8.x (Source Analyzer, Mapplet Designer, Transformation Designer) and of building Star and Snowflake schemas on Oracle, Netezza and Teradata databases.
  • Strong experience in RDBMS concepts, PL/SQL programming units like Procedures, Functions and Packages in UNIX and Windows environments.
  • Good proficiency in using databases like Oracle 11g/10g/9i/8i, Netezza, Teradata and Microsoft SQL Server 2012/2008.
  • Worked on Dimensional Data modelling in Star and Snowflake schemas and Slowly Changing Dimensions.
  • Worked extensively on Erwin in both OLAP and OLTP applications.
  • Proficient in extracting and transforming data from various sources (Oracle tables, flat files, XML) into the data warehouse using ETL tools.
  • Extensively worked on transformations such as Source Qualifier, Expression, Aggregator, Filter, Router, Joiner, Lookup, Sorter, Normalizer, Sequence Generator, Stored Procedure, XML Generator, XML Parser and Update Strategy transformations.
  • Wide knowledge in Extraction, Transformation and Loading of data from multiple sources to the data warehouse.
  • Worked on multiple projects using the Informatica Data Quality (IDQ) client tool, versions 10.1 and 9.6.
  • Proficient using query tools like TOAD, SQL Developer, PL/SQL developer and Teradata SQL Assistant.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, hints and SQL Trace in both Teradata and Oracle.
  • Extensively involved in resolving performance issues in both Informatica and databases.
  • Experienced in logical and physical data model design; expertise in using SQL*Loader to load data from external files into Oracle databases.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
  • Good understanding of healthcare applications, diagnosis codes and EDI transaction codes (837, 835, 270, 271, etc.).
  • Executed and monitored batch and enterprise-level operations using job scheduler tools such as BMC Control-M 8.0.0.
  • Excellent verbal and written communication skills, good understanding of business procedures, and the ability to work independently or in a team.
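As a sketch of the Teradata loading pattern named above (BTEQ wrapped in a UNIX shell script), the following generates a BTEQ import script for a delimited flat file; the database, table, file, and logon values are hypothetical, and the script is only printed here (a dry run) rather than piped into the bteq client:

```shell
#!/bin/sh
# Sketch only: logon, table, and file names below are hypothetical examples.
DATA_FILE=/data/in/claims.txt        # pipe-delimited flat file (assumed layout)
STAGE_TABLE=STAGE_DB.CLAIMS_STG      # hypothetical staging target

# Build the BTEQ script: .IMPORT reads the flat file, .REPEAT * applies the
# parameterized INSERT to every record.
BTEQ_CMDS=".LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = ${DATA_FILE};
.REPEAT *
USING (claim_id VARCHAR(18), amount VARCHAR(12))
INSERT INTO ${STAGE_TABLE} (claim_id, amount)
VALUES (:claim_id, :amount);
.LOGOFF;
.QUIT;"

# Production would run:  printf '%s\n' "$BTEQ_CMDS" | bteq
# Dry run here: just show the generated BTEQ script.
printf '%s\n' "$BTEQ_CMDS"
```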

TECHNICAL SKILLS

ETL tools: Informatica PowerCenter 10.1/9.6/9.x/8.x (Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager), SQL Server Integration Services (SSIS) 2014, DataStage 11.5

Data Modelling: CA Erwin Data Modeler 9.0, Logical data modelling, Physical data modelling, Dimensional modelling (Star and Snowflake schemas, Facts and Dimensions), Entities, Attributes

Databases: Oracle 11g/10g/9i, Netezza, Teradata 13.10, SQL Server 2008/R2/2012, MS Access, DB2

Database tools: TOAD for Oracle 12.8, PL/SQL developer 4.1.3, SQL Plus

Programming: SQL, PL/SQL, UNIX, XML, Hadoop MapReduce, Hive

Scheduling tools: BMC Control-M 8.0.0, IBM Tivoli Workload Scheduler 8.6

Reporting tools: Cognos 8.2/8.4, TIBCO Spotfire Platform 4.5

Defect tracking tools: HP Quality Center 10.0/9.0, JIRA 7.0.0, Microsoft TFS

Operating Systems: MS Windows NT/2000/2003/XP/Vista/7, UNIX, MS-DOS and Linux

Other tools: Microsoft Office Suite, Microsoft Visio

PROFESSIONAL EXPERIENCE

Confidential, NYC

Sr. Informatica ETL Developer

Responsibilities:

  • Involved in designing and customizing logical and physical data models for a data warehouse supporting data from multiple sources in real time.
  • Used Star Schema approach for designing of Data Warehouse using Erwin.
  • Extensively worked on Informatica Designer and Workflow Manager.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Extensively used almost all of the Informatica transformations, including Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator, Joiner and Update Strategy.
  • Worked with memory caches to improve the throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Developed Slowly Changing Dimension (SCD) mappings.
  • Developed Sessions, Worklets using Informatica Workflow Manager.
  • Extensively worked in the Performance tuning of the Programs, Procedures and Processes.
  • Translated business requirements into data warehouse design.
  • Customized shell scripts to run mappings from Control-M.
  • Performed analysis of large data sets using components of the Hadoop ecosystem.
  • Worked closely with DBA and developers during planning, analyzing and testing phase of the project.
  • Ran SQL scripts from TOAD and created Oracle objects such as tables, views, indexes, sequences and users; wrote SQL for problem resolution and performance optimization.
  • Participated in identifying scope of the project, planning and development with other team members.
  • Analyzed, enhanced and maintained Data Warehouse systems using RDBMS database and Informatica tools.
  • Applied knowledge of Informatica Cloud Real Time.
  • Worked with various IT and management teams on scheduling.
  • Prepared test cases and performed unit testing; created test scenarios, documented actual versus expected results, participated in system testing of end-to-end processes, and wrote test plans and test cases in Quality Center.
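The Control-M shell-wrapper pattern described above can be sketched as follows; the Integration Service, domain, folder, workflow name, and environment-variable names are hypothetical, and the pmcmd command is only echoed (dry run) rather than executed:

```shell
#!/bin/sh
# Sketch of a Control-M wrapper around Informatica's pmcmd startworkflow.
INFA_SERVICE=IS_PROD                 # hypothetical Integration Service
INFA_DOMAIN=Domain_Prod              # hypothetical domain
INFA_FOLDER=EDW_LOADS                # hypothetical repository folder
WORKFLOW=${1:-wf_load_claims}        # workflow name passed in by Control-M

# -uv/-pv read the user and password from environment variables instead of
# the command line; -wait blocks until the workflow finishes so Control-M
# sees the true exit status.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f $INFA_FOLDER -wait $WORKFLOW"

echo "$CMD"          # dry run: show the command only
# eval "$CMD"        # production: run it and propagate the exit status
```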

Environment: IICS, Informatica PowerCenter 10.2, Informatica PowerExchange 10.1, Oracle 12c/11g, DB2, Teradata 15, MS SQL Server 2012, IDQ 9.5, Autosys, Snowflake cloud database, AWS S3, JSON, Hadoop, Erwin 9.2, Change Data Capture (CDC), Shell Scripting, ClearCase, PuTTY, WinSCP, Notepad++, JIRA, Control-M, Cognos 10.

Confidential, North Bergen, New Jersey

Senior ETL/Informatica Developer

Responsibilities:

  • Created design and technical specifications for data warehouses and data marts based on requirements.
  • Performed data analysis on Oracle, SQL Server, Netezza and SAP HANA databases to understand source and target database structures and relationships.
  • Participated in data model design discussions to create logical and physical data models.
  • Designed interfaces to source systems and data feeds.
  • Created ETL designs and developed ETL (Extract, Transform and Load) mappings and workflows to extract, transform and load data from multiple systems (Oracle, SQL Server, Netezza and SAP HANA databases, flat files, XML and APIs) using Informatica.
  • Designed batch jobs and developed batch schedules using the Cisco Tidal job scheduler.
  • Performed design and build for database object creation and modification.
  • Created UNIX shell scripts for job handling, file handling and data validation tasks.
  • Designed and developed MFT (Managed File Transfer) interfaces between various systems for file handling.
  • Managed the SAP HANA repository using SAP HANA Studio, HANA Application Lifecycle Management and SAP Change Request Management (ChaRM).
  • Performed system testing and performance testing of data loads at various intervals to ensure correct loads.
  • Provided UAT support, handled production migration of ETL/batch job components, and provided warranty support.
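A minimal sketch of the kind of data-validation shell script described above, assuming a hypothetical feed whose control (trailer) file declares the expected detail row count; the sample data is generated inline so the script is self-contained:

```shell
#!/bin/sh
# Sketch only: file names and layout are hypothetical examples.
# Compares the detail row count of an incoming flat file against the
# count declared in its control file, and flags any mismatch.
DATA_FILE=$(mktemp)
CTL_FILE=$(mktemp)

printf '1001|A\n1002|B\n1003|C\n' > "$DATA_FILE"   # simulated 3-row feed
printf '3\n' > "$CTL_FILE"                          # declared row count

ACTUAL=$(wc -l < "$DATA_FILE" | tr -d ' ')
EXPECTED=$(head -1 "$CTL_FILE" | tr -d ' \r')

if [ "$ACTUAL" -eq "$EXPECTED" ]; then
    STATUS=OK        # counts agree: safe to hand the file to the ETL load
else
    STATUS=MISMATCH  # counts disagree: hold the file and alert operations
fi
echo "rowcount check: $STATUS (expected=$EXPECTED actual=$ACTUAL)"
rm -f "$DATA_FILE" "$CTL_FILE"
```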

Confidential - Jersey City, NJ

Informatica ETL Developer

Responsibilities:

  • Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
  • Worked on project documentation, which included the Functional, Technical and ETL Specification documents.
  • Experienced in using Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
  • Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
  • Extensively worked on complex mappings, which involved slowly changing dimensions.
  • Developed several complex mappings in Mapping Designer using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files, in both Informatica PowerCenter and IDQ.
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.
  • Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table, so they can be corrected and reloaded into the target system.
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
  • Created the design and technical specifications for the ETL process of the project.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities.
  • Worked on loading of data from several flat files to Staging using Teradata MLOAD, FLOAD and BTEQ.
  • Worked with the Release Management Team for the approvals of the Change requests, Incidents using BMC Remedy Incident tool.
  • Worked with the infrastructure team to make sure that the deployment is up to date.
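The SQL*Loader flat-file load mentioned above can be sketched as a shell script that generates a control file and builds the sqlldr command line; the table, columns, file path, and connect string are hypothetical, and the command is only echoed here (dry run) rather than executed:

```shell
#!/bin/sh
# Sketch of a SQL*Loader step for loading a facility flat file into Oracle.
CTL_FILE=$(mktemp)

# Control file: comma-delimited input appended into a hypothetical staging
# table; load_dt is stamped with SYSDATE by SQL*Loader at load time.
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE '/data/in/facility_feed.dat'
APPEND INTO TABLE stg_facility
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(facility_id, facility_name, state_cd, load_dt SYSDATE)
EOF

# Hypothetical credentials/connect string; errors=50 tolerates up to 50 bad
# rows before aborting the load.
CMD="sqlldr userid=etl_user/etl_pass@ORCL control=$CTL_FILE log=stg_facility.log errors=50"

echo "$CMD"          # dry run: production would execute the command instead
rm -f "$CTL_FILE"    # clean up the generated control file
```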
