
Informatica Developer Resume

Washington, DC

SUMMARY:

  • 8+ years of IT experience in planning, analysis, design, implementation, development, maintenance, and support of production environments across domains such as Insurance, Healthcare, Financial, and Retail, with a strong conceptual background in database development and data warehousing.
  • Highly proficient in development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter 10.2.0/9.6/9.5/9.1/9.0/8.x.
  • Proficient in the integration of various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, and flat files into the staging area, ODS, Data Warehouse, and Data Mart.
  • Superior SQL skills, with the ability to write and interpret complex SQL statements; skilled in SQL optimization, ETL debugging, and performance tuning.
  • Experience in AWS (Amazon Web Services), including S3 buckets and Redshift (the AWS data warehouse service).
  • Slowly Changing Dimension management, including Type 1, 2, 3, and hybrid approaches, along with de-normalization, cleansing, conversion, aggregation, and performance optimization.
  • Extensive experience developing Informatica mappings and mapplets with various transformations for extraction, transformation, and loading of data from multiple sources to the data warehouse, as well as creating workflows with worklets and tasks and scheduling those workflows.
  • Experience working with Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, mapplets, mappings, workflows, sessions, reusable transformations, shortcuts, and import/export utilities.
  • Experience in data warehouse development, working with extraction/transformation/loading using Informatica PowerMart/PowerCenter with flat files, Oracle, SQL Server, and Teradata.
  • Experience working with the data quality tool Informatica IDQ 9.1.
  • Experience working with multi-terabyte data warehouses using databases such as Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2008, MS Excel, and flat files.
  • Experience in relational modeling and dimensional data modeling using Star and Snowflake schemas, de-normalization, normalization, and aggregations.
  • Very strong in SQL and PL/SQL, with extensive hands-on experience creating database tables, triggers, sequences, functions, procedures, and packages, and performing SQL performance tuning.
  • Good understanding of ETL/Informatica standards and best practices, including Slowly Changing Dimensions SCD1, SCD2, and SCD3.
  • Hands-on experience with Informatica PowerExchange and Informatica IDQ.
  • Hands-on experience with query tools such as TOAD, PL/SQL Developer, Teradata SQL Assistant, and Queryman.

TECHNICAL SKILLS:

Operating Systems: UNIX, Windows, MS-DOS

Language/Tools: SQL, PL/SQL, C, C++

Scheduling Tools: Autosys, Control-M, Informatica Scheduler

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x, Informatica Cloud, SSIS

Database: MS SQL Server, Oracle 8i/9i/10g, RDBMS DB2, Netezza, Teradata, PostgreSQL, Redshift

Scripting: Shell Scripting, Python

Data Modeling Tools: Microsoft Visio, ERWIN 9.3/7.5

Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)

Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1

Tools & Utilities: TOAD, SQL Developer, SQL*Loader, Putty

Cloud Computing: Amazon Web Services (AWS), S3, RDS, Redshift, SNS

Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Snaplogic, AWS, Appworx

Defect Tracking Tools: ALM, Quality Center

Reporting Tools: IBM Cognos, Tableau 9

PROFESSIONAL EXPERIENCE:

Confidential, Washington DC

Informatica Developer

Responsibilities:

  • Design, develop, and support various ETL processes in the data warehouse environment, including Operational Data Stores, canonical and enterprise dimensional models, and various other Business Intelligence databases
  • Worked on the production support team for a few months, clearing existing defects raised by the data governance team
  • Provided 24/7 on-call support while on the production support team to address defects immediately
  • Assisted the Data Governance team in creating Source-to-Target Mapping and Defect Fix Specification documentation as part of design, development, and production support activities
  • Automated multiple redundant tasks using batch files and shell scripts to reduce overall processing time
  • Gathered requirements for loads from source systems to the Data Acquisition layer
  • Implemented end-to-end process of reading files from various vendors, validating the files, loading the tables and archiving the files.
  • Created complex mappings to load the data from Oracle to Redshift tables.
  • Performed query optimization on long-running Oracle/SQL Server queries using partitions, indexes, and parallel hints to reduce the run time of historical data fixes (see the tuning sketch after this list)
  • Designed the code to check different scenarios in handling flat files and sending notifications to the Production Support teams as needed
  • Converted business requirements to functional specifications for the team and created mappings to load tables which provide data for reporting purposes
  • Handled the code migrations and production support handover meetings
  • Created Solution Design Documents (SDS) for various projects covering design details, load dependencies, and error handling
  • Developed complex source MINUS target queries to validate the data loads (see the validation sketch after this list)
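
As an illustration of the source MINUS target validation described above, a minimal sketch follows; the table and column names (stg_customer, dw_customer, and their keys) are hypothetical placeholders, not the actual project objects.

    -- Rows present in the staging/source table but missing or different in the target;
    -- an empty result in both directions indicates the load reconciled cleanly.
    SELECT customer_id, customer_name, status_cd
      FROM stg_customer          -- hypothetical source/staging table
    MINUS
    SELECT customer_id, customer_name, status_cd
      FROM dw_customer;          -- hypothetical target warehouse table

    -- The reverse direction catches rows in the target that no longer exist in the source.
    SELECT customer_id, customer_name, status_cd
      FROM dw_customer
    MINUS
    SELECT customer_id, customer_name, status_cd
      FROM stg_customer;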
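
The query-tuning bullet above refers to partition pruning and parallel hints; a minimal Oracle sketch of the kind of rewrite involved is shown below. The table, partition, and column names are illustrative assumptions, not the actual objects.

    -- Restrict the scan to one monthly partition and request parallel execution
    -- so a historical-fix query does not scan the full fact table serially.
    SELECT /*+ PARALLEL(f, 8) */
           f.account_id,
           SUM(f.txn_amount) AS total_amount
      FROM fact_transactions PARTITION (p_2018_06) f   -- partition pruning by month
     WHERE f.txn_date >= DATE '2018-06-01'
       AND f.txn_date <  DATE '2018-07-01'
     GROUP BY f.account_id;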

Environment: Informatica PowerCenter 10.2/9.6, Oracle 10g/11g, Redshift, PL/SQL, Putty SSH Client, Appworx, UNIX, Windows XP

Confidential, Centreville, VA

Sr. Informatica/ETL Developer

Responsibilities:

  • Coordinated with business analysts to analyze the business requirements and designed and reviewed the implementation plan.
  • Responsible for the design, development, and testing of processes necessary to extract data from operational databases and transform and load it into the data warehouse using Informatica PowerCenter.
  • Followed ETL standards: audit activity, job control tables, and session validations.
  • Created Complex Mappings to load data using transformations like Source Qualifier, Expression, Aggregator, Dynamic Lookup, Connected and unconnected lookups, Joiner, Sorter, Filter, Stored Procedures, Sequence, Router and Update Strategy.
  • Created different jobs using UNIX shell scripting to call the workflow by using Command tasks.
  • Wrote Oracle SQL queries for joins and other table modifications.
  • Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2) loads (see the sketch after this list).
  • Tuned complex mappings to reduce the total ETL processing time.
  • Extensively used TOAD to test, debug SQL and PL/SQL Scripts, packages, stored procedures and functions.
  • Extracted and transformed data from various sources like Teradata and relational databases (Oracle, SQL Server).
  • Created SnapLogic jobs to pull data from Salesforce.
  • Analyzed source data coming from multiple source systems; designed and developed the data warehouse model flexibly to cater to future business needs.
  • Analyzed existing systems, conceptualized and designed new ones, and deployed innovative solutions with high standards of quality.
  • Developed ETL code to extract data from multiple sources and load it into the data warehouse and AWS Redshift using Informatica.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, code enhancements.
  • Automated and scheduled Oracle and Informatica batch jobs using Control-M, with file watchers and daily, weekly, and on-demand schedules.
  • Performed Developer testing, Functional testing, Unit testing and created Test Plans and Test Cases.
  • Created Unit Test Case documents and captured unit test results for each source system.
  • Worked in an Agile environment with daily stand-ups, sprints, and story tracking in JIRA.
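
A minimal sketch of the SCD Type 2 logic mentioned above, expressed as the equivalent SQL set operations (the actual implementation was an Informatica mapping with lookup and update-strategy transformations); the dimension, staging, and sequence names are hypothetical.

    -- 1) Expire the current version of any dimension row whose tracked attributes changed.
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.status_cd <> d.status_cd));

    -- 2) Insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
          (customer_sk, customer_id, customer_name, status_cd,
           effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.status_cd,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');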

Environment: Informatica PowerCenter 10, Control-M, Oracle 11g, Toad, Redshift, Razor SQL, WinSCP, Composite, NetSuite, SnapLogic, UNIX and TWS.

Confidential, Parsippany, NJ

Sr. Informatica Developer

Responsibilities:

  • Interacted with end users to gather business requirements and reporting needs, and created the Business Requirements Document.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Extracted data from flat files, DB2, SQL Server, and other RDBMS databases into the staging area and populated the data warehouse. Worked with flat files, XML, DB2, and Oracle as sources.
  • Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Generated ABAP programs to load data into Oracle from SAP source systems.
  • Customized ABAP programs according to business requirements to load data from SAP source systems.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL overrides (see the parameterized override sketch after this list).
  • Worked on performance tuning by creating views in Oracle and implemented transformation logics in database using views.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Tested all the modules and transported data to target warehouse tables; scheduled and ran extraction and load processes and monitored sessions and batches using Informatica Workflow Manager and log files.
  • Documented all mappings and workflows precisely.
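
A minimal sketch of the kind of parameterized Source Qualifier SQL override the mapping parameters supported; $$LAST_EXTRACT_DATE is a hypothetical mapping parameter resolved from the parameter file at run time, and the table and column names are illustrative.

    -- Incremental extraction: only pull rows changed since the last successful run.
    SELECT ord.order_id,
           ord.customer_id,
           ord.order_amount,
           ord.last_update_dt
      FROM orders ord
     WHERE ord.last_update_dt > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')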

Environment: Informatica PowerCenter 9.6.1, Control-M, Oracle 11g, Toad, DB2, WinSCP, WinSQL, ERWIN, UNIX and TWS.

Confidential, Portland, OR

Informatica Developer/Admin

Responsibilities:

  • Provided technical leadership and developed new business opportunities.
  • Supported day to day activities of the Data Warehouse.
  • Analyzed the business requirements and functional specifications.
  • Extracted data from Oracle databases and spreadsheets, staged it in a single location, and applied business logic to load it into the central Oracle database.
  • Used Informatica Power Center 9.1/8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Created procedures to truncate data in the target before the session run.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used PL/SQL procedures in Informatica mappings to truncate the data in target tables at run time (a minimal sketch follows this list).
  • Experienced with Teradata utility scripts such as BTEQ, FastLoad, MultiLoad, and FastExport to load data from various source systems into Teradata.
  • Working knowledge of Oracle 11g, SQL Server, Teradata, Netezza, DB2, MySQL, and UNIX shell scripting.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Involved in Unit testing, System testing to check whether the data loads into target are accurate.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Used Informatica Metadata Manager to show data lineage.
  • Collaborated with remote offshore team, creating the requirement documents, verifying coding standards and conducting code reviews.
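
A minimal sketch of the run-time truncate procedure referred to above; the procedure name is hypothetical, and in practice it would be invoked from the Informatica session (for example through a Stored Procedure transformation or a pre-session call).

    -- Truncates the named target table before the session loads it.
    -- EXECUTE IMMEDIATE is needed because TRUNCATE is DDL inside PL/SQL.
    CREATE OR REPLACE PROCEDURE truncate_target_table (p_table_name IN VARCHAR2) AS
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
    END truncate_target_table;
    /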

Environment: Informatica PowerCenter 9.1/8.6, PowerExchange, DB2, SQL Server, Linux.

Confidential, Nashville, TN

Informatica Developer

Responsibilities:

  • Created Informatica mappings using various transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in Informatica Designer.
  • Extensively worked with Teradata database using BTEQ scripts.
  • Involved in all phases of the SDLC: designing, coding, testing, and deploying ETL components of the data warehouse and integrated Data Mart.
  • Created subscriptions for source to target mappings and replication methods using IBM CDC tool.
  • Used NZSQL scripts, NZLOAD commands to load data.
  • Experience in Data Stage Upgrade and Migration projects - from planning to execution.
  • Analyzed heterogeneous data from various systems such as pm and Salesforce.com and validated it in the ODS (Operational Data Store).
  • Worked with Informatica IDQ Data Analyst, Developer with various data profiling techniques to cleanse, match/remove duplicate data.
  • Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ.
  • Designed and developed IDQ solutions for data profiling. Implemented Address Doctor as Address Validator transformation for data profiling in IDQ.
  • Created mappings using pushdown optimization to achieve good performance in loading data into Oracle and Teradata.
  • Developed various SQL queries using joins, sub-queries, and analytic functions to pull data from relational databases such as Oracle, Teradata, and SQL Server (see the sketch after this list).
  • Worked with cleanse, parse, standardization, validation, and scorecard transformations.
  • Production support for the Informatica process, troubleshoot and debug any errors.
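
A minimal sketch of the join/sub-query/analytic-function style of query described above; the table and column names are illustrative assumptions, not the actual project schema.

    -- Latest claim per member using an analytic ROW_NUMBER(), joined to a plan lookup.
    SELECT member_id, claim_id, claim_amount, plan_name
      FROM (SELECT c.member_id,
                   c.claim_id,
                   c.claim_amount,
                   p.plan_name,
                   ROW_NUMBER() OVER (PARTITION BY c.member_id
                                      ORDER BY c.claim_date DESC) AS rn
              FROM claims c
              JOIN plans  p ON p.plan_id = c.plan_id)
     WHERE rn = 1;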

Environment: Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, UNIX Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services.

Confidential

DBA /ETL Developer

Responsibilities:

  • Performed business analysis and requirements gathering, and converted the requirements into technical specifications
  • Involved in designing the data mart (star schema dimensional modeling) after analyzing various source systems and the final Business Objects reports
  • Designed and developed all the slowly changing dimensions to hold all the history data in the data mart
  • Developed all the ETL data loads in Informatica PowerCenter to load data from the source databases into the various dimensions and facts of the MIS data mart
  • Used Informatica data services to profile and document the structure and quality of all data.
  • Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, along with their transformation properties.
  • Extensively used various data cleansing and data conversion functions in transformations (see the sketch after this list).
  • Translated Business processes into Informatica mappings for building Data marts by using Informatica Designer which populated the Data into the Target Star Schema on Oracle 10g Instance.
  • Followed the required client security policies and required approvals to move the code from one environment to other.
  • Created Informatica complex mappings with PL/SQL procedures/functions to build business rules to load data.
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Created automated scripts to perform data cleansing and data loading.
  • Performed complex defect fixes in environments such as UAT and SIT to ensure proper delivery of the developed jobs into the production environment.
  • Attended daily status call with internal team and weekly calls with client and updated the status report.
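
A minimal sketch of the kind of cleansing and conversion expressions involved, written here as SQL; in the mappings the equivalent Informatica expression functions were used, and the column and table names are hypothetical.

    -- Trim, case-normalize, default missing values, and convert string dates
    -- before loading cleansed rows into the data mart.
    SELECT TRIM(UPPER(cust_name))                               AS cust_name,
           NVL(TRIM(email_addr), 'UNKNOWN')                     AS email_addr,
           REGEXP_REPLACE(phone_number, '[^0-9]', '')           AS phone_number,
           TO_DATE(NULLIF(TRIM(birth_dt_txt), ''), 'YYYYMMDD')  AS birth_dt
      FROM stg_customer_raw;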

Environment: Informatica PowerCenter 9.x, Informatica Cloud, Oracle 10g, PL/SQL, Teradata, Linux, Control-M.

Confidential, St. Louis, MO

Informatica Developer

Responsibilities:

  • Developed internal and external Interfaces to send the data in regular intervals to Data warehouse systems.
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Involved in discussion of user and business requirements with business team.
  • Performed data migration in different sites on regular basis.
  • Involved in upgrade of Informatica from 9.1 to 9.5.
  • Portfolio Management Enhancement: analyzed and developed business requirements; designed and created the database and processes to load data from Broadridge using Informatica and SQL Server; fixed defects; wrote complex stored procedures to generate data for PM web reports.
  • Created complex mappings using Unconnected Lookup, Sorter, and Aggregator and Router transformations for populating target tables in efficient manner.
  • Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
  • Provided work-bucket hour estimates and budgeting for each story (Agile process) and communicated status to the PM.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
  • Analyzed session log files in session failures to resolve errors in mapping or session configuration.
  • Written various UNIX shell Scripts for scheduling various data cleansing scripts, loading process and automating the execution of maps.
  • Worked under Agile methodology and used the Rally tool to track tasks.
  • Performed bulk data imports and created stored procedures, functions, views and queries.

Environment: Informatica MDM, Autosys, Oracle 11g, SAP, Toad, WinSQL, ERWIN, UNIX.

Confidential

ETL/SQL Developer

Responsibilities:

  • Co-ordinated Joint Application Development (JAD) sessions with Business Analysts and source developer for performing data analysis and gathering business requirements.
  • Developed technical specifications of the ETL process flow.
  • Designed the Source-to-Target mappings and was involved in designing the Selection Criteria document.
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
  • Used Informatica PowerCenter to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
  • Worked extensively with SSIS to import, export, and transform data; used T-SQL to query the SQL Server 2000 database for data validation and data conditioning.
  • Implemented Informatica Framework for (dynamic parameter file generation, start, failed and succeeded emails for an integration, Error handling and Operational Metadata Logging).
  • Implemented sending of Post-Session Email once data is loaded.
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Involved in performance and query tuning, including generating and interpreting explain plans and tuning SQL to improve performance (see the sketch after this list).
  • Scheduled various daily and monthly ETL loads using Autosys.
  • Involved in Production Support in resolving issues and bugs.
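
A minimal Oracle sketch of the explain-plan workflow mentioned above; the query being explained is a hypothetical placeholder.

    -- Generate and display the execution plan for a candidate query, then use the
    -- plan output (full scans, join order, estimated cardinality) to guide tuning.
    EXPLAIN PLAN FOR
    SELECT o.order_id, c.customer_name
      FROM orders o
      JOIN customers c ON c.customer_id = o.customer_id
     WHERE o.order_date >= DATE '2009-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);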

Environment: Informatica PowerCenter 8.6, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX.
