Sr. Informatica Developer/ Data Analyst Resume
Fort Worth, TX
SUMMARY:
- 9+ years of IT experience with expertise in the analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using ETL tools with RDBMSs such as Oracle, MS SQL Server and DB2 on Windows and UNIX platforms.
- 9+ years of experience in Informatica PowerCenter 9.x/8.x and 4 years using PowerExchange 9.x/8.x.
- Extensive exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.
- Strong experience with Informatica tools using real-time CDC (change data capture) and MD5.
- Experience in integrating various data sources such as Oracle, Teradata, Netezza, mainframes, SQL Server, XML and flat files, with extensive knowledge of Oracle, Teradata, Netezza and MS Access.
- Very strong in data warehousing concepts such as Type I, II and III slowly changing dimensions, facts, surrogate keys, ODS, staging areas and cubes; well versed in the Ralph Kimball and Bill Inmon methodologies.
- Solid understanding of dimensional and relational data modeling concepts such as star-schema and snowflake-schema modeling.
- Superior SQL skills with the ability to write and interpret complex SQL statements and to mentor developers on SQL optimization.
- Expert in writing and optimizing SQL queries in Oracle, SQL Server and Teradata; good understanding of views, synonyms, indexes, partitioning, database joins, statistics and optimization.
- Experience in developing very complex mappings, reusable transformations, sessions and workflows using the Informatica ETL tool to extract data from various sources and load it into targets.
- Experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
- Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution times for huge data volumes on a company merger project. Created numerous mapplets, user-defined functions, reusable transformations and lookups.
- Expertise in SQL and PL/SQL programming, including views, analytical functions, stored procedures, functions and triggers.
- Experience in designing and developing variable-length EBCDIC VSAM files and COBOL copybooks using Informatica PowerExchange 9.6/9.1/8.6.
- Technical expertise in designing technical processes using internal modeling and working with analytical teams to create design specifications; successfully defined and designed critical ETL processes, extraction logic, job control audit tables, dynamic generation of session parameter files, file mover processes, etc.
- Experience with the Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant (see the BTEQ sketch after this summary).
- Deftly executed multi-resource projects following an onsite-offshore model while serving as a mentor for junior team members.
- Excellent communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.
- Exposure to Waterfall and Agile methodologies, including Scrum.
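A minimal sketch of the kind of BTEQ scripting referenced above, assuming a shell wrapper with an embedded BTEQ heredoc; the Teradata host, credentials and table names are placeholder values, not details from any engagement:

```sh
#!/bin/sh
# Row-count sanity check after a FastLoad, run through BTEQ.
# tdprod, etl_user/etl_pass and stg_db.stg_customer are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
SELECT COUNT(*) AS row_cnt
FROM   stg_db.stg_customer;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
```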
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.x/8.x, Informatica PowerExchange 9.x/8.x, Informatica Data Quality 9.x/8.x, Talend, IBM DataStage 11.3
Databases: Oracle 11g/10g, IBM DB2 UDB, MS SQL Server 2008/2012, MS Access 2000, Teradata 13/12/V2R5, Netezza 9
Programming Languages: C, C++, SQL, PL/SQL, XML, UNIX shell scripting
DB Tools: TOAD, SQL Navigator, SQL*Loader, SQL Server Management Studio, Teradata SQL Assistant
Methodologies: SDLC, Agile
Others: Erwin 9.3/5.1/4.1.2, OBIEE 10g, MS Office, Smart FTP, UltraEdit, Autosys, Control-M, HP Quality Center, MS Visio, Business Objects 6.0
Operating Systems: Sun Solaris 8.0/10.0, Windows XP/7, AIX, LINUX
PROFESSIONAL EXPERIENCE:
Confidential, Fort Worth, TX
Sr. Informatica Developer/ Data Analyst
Responsibilities:
- Actively interacted with business users to record user requirements and perform business analysis.
- Translated requirements into business rules & made recommendations for innovative IT solutions.
- Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
- Worked with the Informatica PowerCenter Designer tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Worked with slowly changing dimensions Type 1, Type 2 and Type 3.
- Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and Maintenance activities of the data warehouse.
- Performance tuning of the process at the mapping level, session level, source level, and the target level.
- Utilized Informatica IDQ for initial data profiling and for matching and removing duplicate data during the migration from the legacy systems to the target Oracle database (an illustrative duplicate-profiling query follows this list).
- Implemented various tuning techniques such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, validation techniques and load-efficiency improvements.
- Built exception-handling mappings for data quality, data cleansing and data validation.
- Worked with SQL*Loader to load data from flat files obtained from various facilities.
- Created Workflows containing command, email, session, decision and a wide variety of tasks.
- Developed parameter files for passing values to the mappings for each type of client (see the parameter-file and pmcmd sketch after this list).
- Scheduled batches and sessions within Informatica using the Informatica scheduler and wrote shell scripts for job scheduling.
- Understood the entire functionality and major algorithms of the project and adhered to the company testing process.
- Reviewed SQL queries, created data mapping documents and worked with the data modeling team on analysis and documentation.
- Reviewed the UAT defect log, analyzed issues, updated the issue log in ALM (the defect management tool), worked with the SIT team on resolution, retested fixes and reassigned them to the UAT team.
- Cleaned old financial data and uploaded clean data by using Excel and CSV Files.
- Created a data lineage document covering functions, stored procedures and BTEQ scripts for the data modeling team.
- Responsible for collecting and analyzing metadata / data lineage for risk and regulatory reporting.
- Created automation test plan and reviewed manual test cases for automation.
- Experience building a metadata model using a data governance tool.
- Involved in design and creation of automation framework.
- Extracted data from databases (Oracle, SQL Server, DB2, flat files, Excel files) using Informatica to load a single data warehouse repository.
- Responsible for reverse engineering the existing ETL code for home-grown systems which are going to be incorporated into Fusion Data Documents.
- Coordinate with UAT team for creating additional test scenarios based on data model.
- Responsible for checking data from the source system for nullable and non-nullable fields.
- Experience with data quality, Metadata and ETL tools.
- Supported the team on a 24x7 on-call basis for DB2 issues and created ticket resolutions.
- Responsible for verifying data types within the table and then for the same field across tables in the database.
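A minimal sketch of the per-client parameter-file approach noted above, assuming the workflow is launched through pmcmd; the folder, workflow, session, connection and service names are all hypothetical:

```sh
#!/bin/sh
# Hypothetical wrapper: generates a per-client parameter file, then
# starts the workflow through pmcmd. All names are placeholders.
CLIENT="$1"
PARAM_FILE="/apps/infa/params/wf_stg_load_${CLIENT}.parm"

# Informatica parameter-file section format: [folder.WF:workflow.ST:session]
cat > "$PARAM_FILE" <<EOF
[STAGING.WF:wf_stg_load.ST:s_m_stg_customer]
\$\$CLIENT_ID=${CLIENT}
\$DBConnection_SRC=ORA_SRC_${CLIENT}
EOF

pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f STAGING -paramfile "$PARAM_FILE" \
  -wait wf_stg_load
```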
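The actual matching and de-duplication was done in IDQ; purely as an illustration, a SQL duplicate profile of the kind run alongside that work might look like the following, where the connect string, table and key column are hypothetical:

```sh
#!/bin/sh
# Illustrative duplicate profile (not IDQ itself): lists natural keys
# that occur more than once in a legacy table. Names are placeholders.
sqlplus -s "$DB_USER/$DB_PASS@legacydb" <<'EOF'
SET PAGESIZE 100 LINESIZE 120
SELECT cust_ssn, COUNT(*) AS dup_cnt
FROM   legacy_customer
GROUP  BY cust_ssn
HAVING COUNT(*) > 1
ORDER  BY dup_cnt DESC;
EXIT
EOF
```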
Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.6.1, Informatica Data Quality 9.6.1, Cognos 9.0, Sun Solaris, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Autosys, Shell Scripting, XML, SQL*Loader
Confidential, Irving, TX
Informatica Developer/ Data Analyst
Responsibilities:
- Responsible for design and development of Salesforce Data Warehouse migration project leveraging Informatica PowerCenter ETL tool.
- Designed and developed complex ETL mappings making use of transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator.
- Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
- Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
- Created summarized tables, control tables and staging tables to improve system performance and to serve as a source for immediate recovery of the Teradata database.
- Extracted Salesforce CRM information into the BI data warehouse using the Force.com API/Informatica On Demand to provide integration with Oracle financial information for advanced reporting and analysis.
- Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
- Developed transformation logic as per the requirement, mappings and loaded data into respective targets.
- Used pmcmd command to run workflows from command line interface.
- Responsible for the data management and data cleansing activities using Informatica data quality (IDQ).
- Worked with Informatica Cloud Data Loader for Salesforce, for reducing the time taken to import or export critical business information between Salesforce CRM, Force.com.
- Performed data quality analysis to validate the input data based on the cleansing rules.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
- Used the sandbox for testing to ensure minimum code coverage for the application to be migrated.
- Used the pmcmd command to start, stop and ping the Integration Service from UNIX and created UNIX shell scripts to automate the process.
- Improved performance through tuning at the mapping and session levels.
- Worked with UNIX shell scripts extensively for job execution and automation.
- Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
- Documented Data Mappings/ Transformations as per the business requirement.
- Created XML and Autosys JIL definitions for the developed workflows (a sample JIL sketch follows this list).
- Extensively involved in code deployment from Dev to Testing.
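A minimal sketch of an Autosys JIL definition of the kind referenced above, loaded through the jil command-line client; the job, machine, script and log names are placeholders, not the project's actual values:

```sh
#!/bin/sh
# Loads a hypothetical JIL definition for an Informatica workflow job.
jil <<'EOF'
insert_job: IFX_WF_SF_LOAD   job_type: cmd
command: /apps/infa/scripts/run_wf.sh wf_sf_load
machine: etl_host01
owner: infauser
start_times: "02:00"
std_out_file: /apps/infa/logs/IFX_WF_SF_LOAD.out
std_err_file: /apps/infa/logs/IFX_WF_SF_LOAD.err
alarm_if_fail: 1
EOF
```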
Environment: Informatica PowerCenter 8.6, SQL Server 2008, Oracle 10g, Shell Scripts, Teradata 13, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center
Confidential, San Antonio, TX
Informatica Developer/ Data Analyst
Responsibilities:
- Interacted with business analyst to understand the business requirements.
- Involved in gathering requirements from business users.
- Extracted source data using Informatica tools and stored procedures from source systems.
- Developed transformation logic and designed various complex mappings and mapplets.
- Designed various mappings using transformations like Look Up, Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and Expression Transformation.
- Used Mapplets, parameters and variables to implement object orientation techniques and facilitate the reusability of code.
- Used Workflow Manager to create and run batches for different applications.
- Developed UNIX scripts as pre/post-session commands to schedule loads through the SQL*Loader utility (see the sketch after this list).
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mapping.
- Involved in fine-tuning the Informatica code (mappings and sessions), stored procedures and SQL to obtain optimal performance and throughput.
- Used various techniques like Sectioning, Ranking, Filtering, Slice and Dice to develop the reports.
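A minimal sketch of a pre-session command of the kind described above, assuming a flat file loaded with SQL*Loader before the Informatica session runs; the paths, connect string and control-file layout are placeholders:

```sh
#!/bin/sh
# Pre-session load of a pipe-delimited flat file via SQL*Loader.
# File paths, table and columns are placeholders.
cat > /tmp/stg_orders.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/orders.dat'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY '|'
(order_id, cust_id, order_dt DATE "YYYY-MM-DD", amount)
EOF

sqlldr userid="$DB_USER/$DB_PASS@orcl" \
       control=/tmp/stg_orders.ctl \
       log=/tmp/stg_orders.log \
       bad=/tmp/stg_orders.bad
```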
Environment: Informatica PowerCenter 8.1, Business Objects 6.0, Oracle 10g, SQL*Loader, PL/SQL, DB2, UNIX Shell Programming, Linux and Windows NT