
Informatica ETL Developer/Data Analyst Resume


Plano, TX

SUMMARY:

  • 7+ years of professional experience working as an ETL Informatica Developer/Data Analyst with emphasis on Data Mapping, Data Validation, and Requirement Gathering in a Data Warehousing environment.
  • Worked in Agile and Waterfall models; proficient in the Software Development Life Cycle (SDLC), including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
  • Expertise in Analysis, Design, Development, Implementation, Modeling, Testing and support for Data warehousing & OLAP applications.
  • More than 6 years of experience in Informatica PowerCenter 10.0/9.x/8.x and last 4 years using PowerExchange 10.0/9.x/8.x
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5.
  • Experience in integration of various data sources like SAP, Oracle, Teradata, Netezza, Mainframes, SQL server, XML, Flat files and extensive knowledge on Oracle, Teradata, Netezza and MS Access.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.
  • Experience in Hadoop Ecosystem using HDFS, SparkSQL and Hive
  • Implemented Slowly Changing Dimension (SCD) Type I and Type II in dimensional models as per requirements; a minimal illustrative SQL sketch appears after this list. Strong in data warehousing concepts such as dimensions, facts, surrogate keys, ODS, staging areas and cubes; well versed in Ralph Kimball and Bill Inmon methodologies.
  • Solid understanding of data modeling (dimensional & relational) concepts such as Star Schema and Snowflake Schema modeling.
  • Expert in writing pseudo code, SQL queries and optimizing the queries in Oracle, SQL Server and Teradata. Good understanding of Views, Synonyms, Indexes, Partitioning, Database Joins, Stats and Optimization.
  • Experience in developing very complex mappings, reusable transformations, sessions and workflows using Informatica ETL tool to extract data from various sources and load into targets.
  • Experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
  • Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution time for huge data volumes on company merger projects. Created numerous mapplets, user-defined functions, reusable transformations and lookups.
  • Expertise in SQL and PL/SQL programming; excellent with Views, Analytical Functions, Stored Procedures, Functions and Triggers.
  • Experience in the design and development of variable-length EBCDIC VSAM files and COBOL copybooks using Informatica PowerExchange 10.0/9.x/8.x.
  • Experience in IDQ development around data profiling, cleansing, parsing, standardization, verification, matching and data quality exception monitoring and handling
  • Technical expertise in designing technical processes by using Internal Modeling & working with Analytical Teams to create design specifications; successfully defined & designed critical ETL processes, Extraction Logic, Job Control Audit Tables, Dynamic Generation of Session Parameter File, File Mover Process, etc.
  • Experience in Teradata RDBMS using BTEQ scripting, FastLoad, FastExport, MultiLoad, TPump and Teradata SQL Assistant utilities.
  • Assisted the other ETL developers in solving complex scenarios and coordinated with source systems owners with day-to-day ETL progress monitoring.
  • Experience in writing UNIX shell scripts for Data validations, Data cleansing etc.
  • Worked in various verticals such as Banking and Financial.
  • Deftly executed multi-resource projects following the Onsite-Offshore model while serving as a mentor for junior team members.
  • Excellent communication and presentation skills, works well as an integral part of a team, as well as independently, intellectually flexible and adaptive to change.
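
SCD Type 2 illustration (a minimal SQL sketch only, assuming hypothetical stg_customer and dim_customer tables, a dim_customer_seq sequence and tracked columns; not a specific client schema):

  -- Step 1: close out the current dimension row when a tracked attribute changed.
  UPDATE dim_customer d
     SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
         d.current_flag     = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND (s.address <> d.address OR s.status <> d.status));

  -- Step 2: insert a new current version for new and changed customers.
  INSERT INTO dim_customer
        (customer_sk, customer_id, address, status,
         effective_start_dt, effective_end_dt, current_flag)
  SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM stg_customer s
   WHERE NOT EXISTS (SELECT 1
                       FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');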

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.0/9.x/8.x, Informatica PowerExchange 9.x/8.x, Informatica DataQuality 10.0/9.x/8.x

Database: Oracle 11g/10g, IBM DB2 UDB, MS SQL Server 2008/2012, MS Access 2000, Teradata 14/13/V2R5, Snowflake cloud data warehouse

Data modeling: Erwin 9.1/7.1, Visio

Languages: SQL, PL/SQL, XSD, XML, Unix shell scripting

Tools: AWS S3, AWS EC2, Microsoft Visio, TOAD, Oracle SQL Developer, WinSQL, WinSCP, Secure Shell Client, SQL*Loader, MS Office, SmartFTP, UltraEdit, Autosys, Control-M, HP Quality Center

Operating System: Windows, UNIX

Reports: MicroStrategy 9.x, Cognos 9.0, Crystal Reports

Data Warehouse Methodologies: Ralph Kimball's Star Schema and Snowflake Schema

Methodologies: SDLC, Agile

Others: MS Word, MS Access, MS Office, GitHub

PROFESSIONAL EXPERIENCE:

Confidential, Plano TX

Informatica ETL Developer/Data Analyst

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Actively interacted with business users to capture user requirements and perform business analysis.
  • Involved in Analysis, profiling and cleansing of source data and understanding the business process.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
  • Involved in documentation of Data Mapping & ETL specifications for development from source to target.
  • Implemented Change Data Capture (CDC) while integrating the enterprise data sources.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Performed bulk loads of JSON data from an AWS S3 bucket into Snowflake.
  • Used Snowflake functions to parse semi-structured data entirely with SQL statements; an illustrative sketch appears after this list.
  • Created the design and technical specifications for the ETL process of the project.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
  • Worked on Informatica PowerCenter tool - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
  • Worked with slowly changing dimension Type1, Type2, and Type3.
  • Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and Maintenance activities of the data warehouse.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Utilized Informatica IDQ to complete the initial data profiling and matching/removing duplicate data for the process of data migration from the legacy systems to the target Oracle Database.
  • Implemented various new components like increasing the DTM Buffer Size, Database Estimation, Incremental Loading, Incremental aggregation, Validation Techniques, and load efficiency.
  • Built exception-handling mappings for data quality, data cleansing and data validation.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities.
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Tuning the mappings based on criteria, creating partitions in case of performance issues.
  • Performed end-to-end testing using scripts to verify failures in the mappings.
  • Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
  • Resolved tickets based on the priority levels raised by the QA team.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT)
  • Developed Parameter files for passing values to the mappings for each type of client
  • Scheduled batch and sessions within Informatica using Informatica scheduler and also wrote shell scripts for job scheduling.
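
Snowflake JSON-load illustration (a minimal sketch assuming a hypothetical claims_s3_stage stage, claims_raw table and JSON paths; authentication via STORAGE_INTEGRATION or CREDENTIALS is omitted):

  CREATE OR REPLACE FILE FORMAT json_ff TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE;

  CREATE OR REPLACE STAGE claims_s3_stage URL = 's3://my-bucket/claims/';

  CREATE OR REPLACE TABLE claims_raw (src VARIANT);

  -- Bulk load the raw JSON documents from the S3 stage.
  COPY INTO claims_raw FROM @claims_s3_stage FILE_FORMAT = (FORMAT_NAME = 'json_ff');

  -- Parse the semi-structured data entirely in SQL: path notation plus LATERAL FLATTEN.
  SELECT src:claimId::STRING              AS claim_id,
         src:member.name::STRING          AS member_name,
         line.value:amount::NUMBER(12,2)  AS line_amount
  FROM   claims_raw,
         LATERAL FLATTEN(input => src:lines) line;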

Environment: Informatica PowerCenter 10.0, Informatica PowerExchange 10.0, Informatica DataQuality 10.0, Cognos 9.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Snowflake cloud data warehouse, AWS S3 bucket, JSON, SQL Server 2012, HDFS, Hive, SparkSQL, Control M, Shell Scripting, XML, SQL Loader, Putty, WinSCP

Confidential, NY

SQL Server Developer/ ETL Developer

Responsibilities:

  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated Mappings, Sessions and Workflows from the Development to the Test and then to the UAT environment.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
  • Created Sessions and extracted data from various sources, transformed data according to the requirement and loading into data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Involved in Unit Testing and Resolution of various Bottlenecks came across.
  • Implemented various Performance Tuning techniques.
  • Used Teradata as a source system
  • Worked with the Cognos team to build the data warehouse.
  • Generated matrix reports, drill down, drill through, sub reports, chart reports, multi parameterized reports.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
  • Created summarized tables, control tables, staging tables to improve the system performance and as a source for immediate recovery of Teradata database
  • Extracted the Salesforce CRM information into BI Data Warehouse using Force.com API/Informatica on Demand to provide integration with oracle financial information to perform advanced reporting and analysis.
  • Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
  • Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
  • Used pmcmd command to run workflows from command line interface.
  • Responsible for the data management and data cleansing activities using Informatica data quality (IDQ).
  • Worked with Informatica Cloud Data Loader for Salesforce, for reducing the time taken to import or export critical business information between Salesforce CRM, Force.com.
  • Performed data quality analysis to validate the input data based on the cleansing rules.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
  • Used the sandbox for testing to ensure minimum code coverage for the application to be migrated to production.
  • Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
  • Worked on profiling source data and determining all possible source data values and metadata characteristics (an illustrative profiling SQL sketch appears after this list).
  • Worked on designing and executing a Data Quality Audit/Assessment and data quality mappings that cleanse, de-duplicate, and otherwise prepare the project data.
  • Worked on implementing data quality processes including transliteration, parsing, analysis, standardization and enrichment at point of entry and batch modes.
  • Improved performance through testing and tuning at the mapping and session levels.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Created XML, Autosys JIL for the developed workflows.
  • Migrated code from Development to Test and, upon validation, to Pre-Production and Production environments.
  • Provided technical assistance to business program users, and developed programs for business and technical applications.
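
Source-profiling illustration (a minimal SQL sketch; the src_account table and its columns are hypothetical names used only for the example):

  -- Completeness and basic metadata characteristics of one column.
  SELECT COUNT(*)                       AS total_rows,
         COUNT(account_status)          AS non_null_rows,
         COUNT(DISTINCT account_status) AS distinct_values,
         MIN(open_dt)                   AS min_open_dt,
         MAX(open_dt)                   AS max_open_dt
  FROM   src_account;

  -- Value-frequency distribution used to derive cleansing and standardization rules.
  SELECT account_status, COUNT(*) AS row_cnt
  FROM   src_account
  GROUP BY account_status
  ORDER BY row_cnt DESC;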

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange 9.1, SQL Server 2012, Shell Scripts, Teradata 13, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center, T-SQL, MicroStrategy 9.2

Confidential

ETL Developer

Responsibilities:

  • Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
  • Worked on project documentation which included the Functional, Technical and ETL Specification documents.
  • Used Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
  • Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
  • Extensively worked on complex mappings which involved slowly changing dimensions.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets & Parameter Files in Mapping Designer, using both Informatica PowerCenter and IDQ.
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends error rows to an error table so they can be corrected and reloaded into the target system.
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Worked on developing Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces based on the requirements and limitations of the Project.
  • Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
  • Worked in the ETL Code Migration Process from DEV to ITE, QA and to PRODUCTION.
  • Created the design and technical specifications for the ETL process of the project.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups Using Connected, Unconnected, Expression, Aggregator, Update strategy & stored procedure transformation.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities.
  • Loaded data from several flat files into staging tables using Teradata MLOAD, FLOAD and BTEQ (an illustrative BTEQ sketch appears after this list).
  • Worked with the Release Management Team for the approvals of the Change requests, Incidents using BMC Remedy Incident tool.
  • Worked with the infrastructure team to make sure that the deployment is up-to-date.
  • Provided 24x7 production support when necessary.
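
BTEQ flat-file load illustration (a minimal sketch; the logon string, file path and stg_db.stg_claim staging table are hypothetical, and the password is a placeholder; VARTEXT imports require VARCHAR host variables):

  .LOGON tdprod/etl_user,********;

  .IMPORT VARTEXT ',' FILE = /data/inbound/claims.csv;
  .REPEAT *
  USING (in_claim_id VARCHAR(20), in_claim_amt VARCHAR(20), in_claim_dt VARCHAR(10))
  INSERT INTO stg_db.stg_claim (claim_id, claim_amt, claim_dt)
  VALUES (:in_claim_id,
          CAST(:in_claim_amt AS DECIMAL(12,2)),
          CAST(:in_claim_dt  AS DATE FORMAT 'YYYY-MM-DD'));

  .QUIT;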

Environment: Informatica PowerCenter 8.6, Informatica IDQ 8.6, Informatica PowerExchange 8.6, Oracle 11g, SQL, Erwin 5, UNIX CRONTAB, Control-M, Remedy Incident Tool, Ultra Edit, Teradata 13

Confidential

Informatica Developer

Responsibilities:

  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server and flat files) by incorporating business rules using different objects and functions that the tool supports.
  • Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Developed stored procedures and used them in the Stored Procedure transformation for data processing; also used data migration tools (an illustrative PL/SQL sketch appears after this list).
  • Documented Informatica mappings in Excel spread sheet.
  • Tuned the Informatica mappings for optimal load performance.
  • Used BTEQ, FEXP, FLOAD and MLOAD Teradata utilities to export and load data to/from flat files.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Constantly interacted with business users to discuss requirements.
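
Stored-procedure illustration (a minimal Oracle PL/SQL sketch of the kind invoked from an Informatica Stored Procedure transformation; the get_customer_tier name and tier thresholds are hypothetical):

  CREATE OR REPLACE PROCEDURE get_customer_tier (
      p_annual_spend IN  NUMBER,
      p_tier         OUT VARCHAR2)
  IS
  BEGIN
      -- Derive a tier code from the spend value passed in by the mapping.
      IF p_annual_spend >= 100000 THEN
          p_tier := 'PLATINUM';
      ELSIF p_annual_spend >= 25000 THEN
          p_tier := 'GOLD';
      ELSE
          p_tier := 'STANDARD';
      END IF;
  END get_customer_tier;
  /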

Environment: Informatica 8.6, Oracle 10g, SQL server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin 4.x, Unix, Tortoise SVN, Flat files.
