
Sr. Informatica Developer Resume


Houston, TX

SUMMARY:

  • 6+ years of IT experience with expertise in analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using ETL tools with RDBMSs such as Oracle, MS SQL Server, Teradata, DB2 and the Snowflake cloud database on Windows and UNIX platforms.
  • Experience in Informatica PowerCenter 10.0/9.x/8.x, including the last 4 years using PowerExchange 10.2/9.x/8.x.
  • Extensive exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5-based change detection (a minimal SQL sketch follows this summary).
  • Experience in integration of various data sources such as Oracle, Teradata, mainframes, SQL Server, XML, flat files and JSON, with extensive experience on Oracle, Teradata and Snowflake.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.
  • Very strong in data warehousing concepts such as Type I, II and III dimensions, facts, surrogate keys, ODS, staging areas and cubes; well versed in the Ralph Kimball and Bill Inmon methodologies.
  • Solid understanding of data modeling (dimensional & relational) concepts such as star-schema and snowflake-schema modeling.
  • Expert in writing complex SQL queries and PL/SQL and in optimizing queries in Oracle, SQL Server and Teradata; excellent with views, synonyms, indexes, partitioning, database joins, stored procedures, statistics and optimization (an illustrative analytic query follows this summary).
  • Well-versed in tuning Teradata ETL queries, remediating stale or excessive statistics, resolving spool space issues and applying compression to reclaim space.
  • Expertise in SQL and PL/SQL programming, including views, analytical functions, stored procedures, functions and triggers.
  • Experience in developing very complex mappings, reusable transformations, sessions and workflows using the Informatica ETL tool to extract data from various sources and load it into targets.
  • Experience in tuning and scaling procedures for better performance by running explain plans and applying approaches such as hints and bulk loads.
  • Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution times for huge data volumes on company merger projects. Created numerous mapplets, user-defined functions, reusable transformations and lookups.
  • Technical expertise in designing technical processes using internal modeling & working with analytical teams to create design specifications; successfully defined and designed critical ETL processes, extraction logic, job control audit tables, dynamic generation of session parameter files, file mover processes, etc.
  • Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump and TPT, and tools such as SQL Assistant and Viewpoint.
  • Expert in T-SQL coding and testing: functions, views, triggers, cursors, the data dictionary, stored procedures, etc.
  • Assisted other ETL developers in solving complex scenarios and coordinated with source-system owners on day-to-day ETL progress monitoring.
  • Skilled in creating and maintaining ETL specification documents, use cases, source-to-target mappings and requirement traceability matrices, performing impact assessments, and providing effort estimates and deployment artifacts.
  • Deftly executed multi-resource projects following the onsite/offshore model while serving as a mentor for junior team members.
  • Strong knowledge of the Hadoop ecosystem (HDFS, Hive, Impala, Hue, etc.)
  • Exposure to the end-to-end SDLC and Agile methodology.
  • Good communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.
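
A minimal SQL sketch of the MD5-based change detection mentioned above, assuming illustrative staging and dimension tables (stg_customer, dim_customer) and a stored row_hash column; MD5() is native to Snowflake, and other databases expose equivalents (e.g., STANDARD_HASH in Oracle):

    -- Flag new and changed rows by comparing an MD5 hash of the tracked
    -- attributes against the hash stored on the current dimension row.
    SELECT s.customer_id,
           s.name,
           s.address,
           MD5(s.name || '|' || s.address) AS src_hash
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL                          -- brand-new row
       OR  d.row_hash <> MD5(s.name || '|' || s.address); -- changed row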
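
An illustrative example of the analytic SQL claimed above, against a hypothetical fact_transaction table: a per-account running total plus a recency rank that isolates each account's latest transaction:

    -- Windowed aggregation: running total and recency rank per account.
    SELECT account_id,
           txn_date,
           amount,
           SUM(amount)  OVER (PARTITION BY account_id
                              ORDER BY txn_date
                              ROWS UNBOUNDED PRECEDING) AS running_total,
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY txn_date DESC)   AS recency_rank
    FROM   fact_transaction;
    -- Filtering on recency_rank = 1 in an outer query returns each
    -- account's most recent transaction.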

TECHNICAL SKILLS:

Data Warehousing/ETL Technology: Informatica PowerCenter 10.x/9.x/8.x, Informatica PowerExchange 10.x/9.x/8.x

Database: Oracle 11g/10g, IBM DB2 UDB, MS SQL Server 2016/2012/2008, MS Access 2000, MySQL, PostgreSQL, Teradata 15/13/12/V2R5, Snowflake cloud data warehouse

Data modeling: Erwin 9.1/7.1

Languages: SQL, PL/SQL, XSD, XML, Unix shell scripting

Tools: Microsoft Visio, TOAD, Oracle SQL Developer, WinSQL, WinSCP, Secure Shell Client, OBIEE 10g, SQL*Loader, MS Office, SmartFTP, UltraEdit, Autosys, Control-M, HP Quality Center

Operating System: Windows, UNIX

Reports: Cognos 10.0/9.0, QlikView, Tableau 9/10

Data Warehouse Methodologies: Ralph Kimball's Star Schema and Snowflake Schema

Methodologies: SDLC, Agile

Others: MS Word, MS Access, MS Office, GitHub, HDFS, Hive, Impala, Hue

PROFESSIONAL EXPERIENCE:

Confidential, Houston, TX

Sr. Informatica Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirement gathering, design, development and testing to production deployment, user training and production support.
  • Actively interacted with business users to record user requirements and perform business analysis.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Worked with PowerCenter Designer tools in developing mappings and mapplets to extract and load data from flat files and the Oracle database.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Created the design and technical specifications for the ETL process of the project.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy & Stored Procedure transformations.
  • Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated Snowflake from the S3 bucket using complex SQL (a COPY INTO sketch follows this list).
  • Loaded diverse data types (structured, JSON, flat files, etc.) into the Snowflake cloud data warehouse.
  • Worked on Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer & Mapplets, and Transformations.
  • Worked on various complex mappings and designed Type 1 and Type 2 slowly changing dimensions (a Type 2 sketch follows this list).
  • Maintained Development, Test and Production mappings and their migration using Repository Manager; involved in enhancement and maintenance activities of the data warehouse.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump scripts to load data into Teradata tables.
  • Performed performance tuning of the process at the mapping, session, source and target levels.
  • Implemented various techniques such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, validation techniques and load-efficiency improvements.
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Tuned mappings against performance criteria and created partitions where performance issues arose.
  • Performed end-to-end tests to verify mapping failures using shell scripts.
  • Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
  • Resolved tickets raised by the QA team based on their priority levels.
  • Developed parameter files for passing values to the mappings for each type of client.
  • Scheduled batches and sessions within Informatica using the Informatica scheduler, and wrote shell scripts for job scheduling.
  • Understood the entire functionality and major algorithms of the project and adhered to the company's testing process.
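
A hedged sketch of the S3-to-Snowflake load described above; the stage, bucket path and table names are placeholders rather than the project's actual objects:

    -- Define an external stage over the S3 landing bucket, then bulk-load
    -- the staging table with COPY INTO.
    CREATE OR REPLACE STAGE sales_stage
      URL = 's3://example-landing-bucket/sales/'       -- placeholder bucket
      CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    COPY INTO stg_sales
    FROM @sales_stage
    ON_ERROR = 'CONTINUE';  -- log bad rows instead of failing the load

    -- Semi-structured JSON files can land in a VARIANT column and be
    -- flattened with dot-path casts, e.g. v:customer.id::NUMBER.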
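
A minimal sketch of the Type 2 slowly changing dimension logic, reusing the illustrative stg_customer/dim_customer names and a stored row_hash (Oracle-style syntax): expire the current row when tracked attributes change, then insert the new version.

    -- Step 1: close out current rows whose attributes changed.
    UPDATE dim_customer d
    SET    d.end_date = CURRENT_DATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.row_hash   <> d.row_hash);

    -- Step 2: insert a fresh current version for new and changed keys.
    INSERT INTO dim_customer
          (customer_sk, customer_id, name, address, row_hash,
           start_date, end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.name, s.address,
           s.row_hash, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id
          AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;  -- changed rows were expired in step 1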

Environment: Informatica PowerCenter 10.2, Cognos 10.2, Tableau 10.2, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Snowflake cloud database, AWS S3 Bucket, Teradata 15, PostgreSQL 9.2, SQL Server 2012, Control-M, Shell Scripting.

Confidential, Orlando, FL

Informatica Developer

Responsibilities:

  • Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
  • Worked on project documentation which included the Functional, Technical and ETL Specification documents.
  • Used Informatica PowerCenter for ETL (extraction, transformation and loading) of data from heterogeneous source systems into the target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions and workflows from the Development environment to Test and then to UAT.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and Router-based data flow management into multiple targets.
  • Created sessions, extracted data from various sources, transformed it according to the requirements and loaded it into the data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Imported various heterogeneous files using the Informatica PowerCenter Source Analyzer.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Involved in unit testing and resolution of various bottlenecks encountered.
  • Implemented various Performance Tuning techniques.
  • Worked with the Cognos team to build the data warehouse.
  • Generated matrix, drill-down, drill-through, sub-report, chart and multi-parameterized reports.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
  • Created summarized tables, control tables and staging tables to improve system performance and to serve as a source for immediate recovery of the Teradata database.
  • Extracted Salesforce CRM information into the BI data warehouse using the Force.com API/Informatica On Demand, integrating it with Oracle financial information for advanced reporting and analysis.
  • Created stored procedures to transform data, and worked extensively in T-SQL and PL/SQL for the various transformations needed while loading the data warehouse (a PL/SQL sketch follows this list).
  • Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
  • Responsible for identifying bottlenecks and fixing them through performance tuning.
  • Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
  • Used the pmcmd command to start, stop and ping the Informatica server and to run workflows from the UNIX command line, and created UNIX shell scripts to automate the process.
  • Performed performance testing and tuning at the mapping and session levels.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Created XML and Autosys JIL files for the developed workflows.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Migrated code from Development to Test and, upon validation, to Pre-Production and Production environments.
  • Provided technical assistance to business program users, and developed programs for business and technical applications.
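
A hypothetical PL/SQL procedure of the kind described above; the aggregate table, fact table and column names are illustrative:

    -- Rebuild one month of an aggregate during the warehouse load;
    -- the delete-then-insert pattern keeps re-runs idempotent.
    CREATE OR REPLACE PROCEDURE load_monthly_revenue (p_month IN DATE) AS
    BEGIN
      DELETE FROM agg_monthly_revenue
      WHERE  revenue_month = TRUNC(p_month, 'MM');

      INSERT INTO agg_monthly_revenue (revenue_month, region, total_amount)
      SELECT TRUNC(txn_date, 'MM'), region, SUM(amount)
      FROM   fact_transaction
      WHERE  txn_date >= TRUNC(p_month, 'MM')
      AND    txn_date <  ADD_MONTHS(TRUNC(p_month, 'MM'), 1)
      GROUP  BY TRUNC(txn_date, 'MM'), region;

      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_monthly_revenue;
    /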

Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, SQL Server 2008, Shell Scripts, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center, Cognos 9, T-SQL

Confidential

ETL Informatica Developer

Responsibilities:

  • Used Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
  • Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
  • Extensively worked on complex mappings which involved slowly changing dimensions.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations.
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.
  • Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table so they can be corrected and reloaded into the target system (a sketch follows this list).
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Worked on developing Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces based on the requirements and limitations of the Project.
  • Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
  • Worked in the ETL Code Migration Process from DEV to ITE, QA and to PRODUCTION.
  • Created the design and technical specifications for the ETL process of the project.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy & Stored Procedure transformations.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities.
  • Loaded data from several flat files into staging using Teradata MultiLoad, FastLoad and BTEQ (a BTEQ sketch follows this list).
  • Worked with the Release Management team on approvals of change requests and incidents using the BMC Remedy incident tool.
  • Worked with the infrastructure team to make sure that the deployment is up-to-date.
  • Provided 24x7 production support when necessary.
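
A sketch of the error-routing logic described above, assuming a hypothetical staging table and error table; the severity rules shown are illustrative:

    -- Route rejected rows to the error table with a severity level so
    -- they can be corrected and reloaded into the target system.
    INSERT INTO etl_error_customer
          (customer_id, name, err_column, err_severity, load_date)
    SELECT s.customer_id, s.name,
           CASE WHEN s.customer_id IS NULL THEN 'CUSTOMER_ID'
                WHEN s.name        IS NULL THEN 'NAME'
                ELSE 'BIRTH_DATE' END,
           CASE WHEN s.customer_id IS NULL THEN 'FATAL'  -- row cannot be keyed
                WHEN s.name        IS NULL THEN 'HIGH'
                ELSE 'LOW' END,                          -- defaultable field
           CURRENT_DATE
    FROM   stg_customer s
    WHERE  s.customer_id IS NULL OR s.name IS NULL OR s.birth_date IS NULL;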
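
A hedged sketch of a BTEQ flat-file staging load of the kind referenced above (BTEQ dot commands wrapping SQL); the host, credentials, file path and table names are placeholders:

    .LOGON tdhost/etl_user,password;
    -- Read a pipe-delimited flat file and insert each record into staging.
    .IMPORT VARTEXT '|' FILE = /data/inbound/customer.txt;
    .QUIET ON
    .REPEAT *
    USING (in_customer_id VARCHAR(18), in_name VARCHAR(60))
    INSERT INTO stg_db.stg_customer (customer_id, name)
    VALUES (:in_customer_id, :in_name);
    .LOGOFF;
    .QUIT;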

Environment: Informatica PowerCenter 8.6, Oracle 11g, SQL, Erwin 8.1, UNIX CRONTAB, Control-M, Remedy Incident Tool, Teradata.
