
Sr. Informatica ETL Developer Resume

Overland Park, KS

SUMMARY:

  • 8+ years of IT experience in design, analysis, development, documentation, coding, and implementation including Databases, Data Warehouse, ETL Design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica Power Center 9.x/8.x/7.x, Informatica Data Quality.
  • Expertise in Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions
  • Involved in the complete software development life cycle (SDLC) of projects, with experience in domains such as Healthcare, Banking, Insurance, and Appliances.
  • Expertise in the ETL Tool Informatica and have extensive experience in Power Center Client tools including Designer, Repository Manager, Workflow Manager/ Workflow Monitor
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected and Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, and Update Strategy, as well as Reusable Transformations and User Defined Functions.
  • Extensively worked on relational database systems like Oracle, MS SQL Server, and Teradata, and source files like flat files, XML files, and COBOL files
  • Excellent background in implementation of business applications and in using RDBMS and OOP concepts.
  • Expert in Data Extraction, Transforming and Loading (ETL) using SQL Server Integration Services (SSIS)
  • Extensive experience in designing, developing, and delivering business intelligence solutions using SQL Server Integration Services (SSIS) and Reporting Services (SSRS)
  • Good exposure to Informatica Cloud Services. Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
  • Expertise in data warehousing concepts like OLTP/OLAP system study, analysis, and E-R modeling, developing database schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional data modeling.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions. Hands-on experience in Informatica upgrade from 9.1 to 10.x
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle.
  • Experience with healthcare data in HIPAA ANSI X12. Strong experience in designing ETL packages using different data sources (SQL Server, flat files, XML, etc.)
  • Loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services (SSIS).
  • Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities like pmcmd to execute workflows in non-Windows environments (a minimal example follows this list)
  • Proficiency in working with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
  • Implemented change data capture (CDC) using Informatica Power Exchange to load data from the Clarity DB to the Teradata warehouse.
  • Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server, MS Access, Teradata, Flat Files, XML files and other sources like Salesforce, etc.
  • Experience in Migrating Data from Legacy systems to Oracle database using SQL*Loader. Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tool (Control-M v7/v8), CA WA Workstation (ESP)
  • Expert in analyzing Business & Functional Specifications, creating Technical Design Document and Unit Test cases
  • Extensive experience in Agile and Waterfall SDLC methodologies. Experience in performance tuning of targets, sources, mappings, workflows, and systems.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Involved in the SDLC (software development life cycle) - Waterfall and Scrum/Agile - of building a Data Warehouse on Windows and UNIX platforms.
  • Well versed with onsite/offshore project delivery model and experience in working with offshore teams
  • Designed Applications according to the customer requirements and specifications. Excellent Interpersonal and Communication skills, coupled with strong technical and problem-solving capabilities.
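
A minimal sketch of the kind of pmcmd wrapper script referenced above; the integration service, domain, folder, and workflow names are illustrative placeholders, not taken from any specific project.

#!/bin/ksh
# Launch an Informatica workflow from UNIX via pmcmd (names are placeholders).
INFA_USER=infa_etl
INFA_PWD_VAR=INFA_PASSWORD        # password is read from this environment variable
WF_NAME=wf_load_customer_dim

pmcmd startworkflow \
    -sv IS_DEV -d Domain_Dev \
    -u "$INFA_USER" -pv "$INFA_PWD_VAR" \
    -f FINANCE_DW -wait "$WF_NAME"

if [ $? -ne 0 ]; then
    echo "Workflow $WF_NAME failed" >&2
    exit 1
fi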

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.

Reporting Tools: Business Objects, QlikView, OBIEE, MicroStrategy, Oracle Analytics, etc.

Scheduling Tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Data Modeling: Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT and Dimension tables), Physical and Logical Data Modeling, and ER Diagrams.

Hadoop Ecosystem: HDFS, Hive, MapReduce, Sqoop, Syncsort.

DB Tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6, SSIS, Microsoft SSRS (Oracle), DB2, Teradata, AQT v9 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle, Sybase), Visio, ERWIN

Languages: C, C++, Java, .Net, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Operating Systems: UNIX, Linux, Windows

PROFESSIONAL EXPERIENCE:

Confidential, Overland Park, KS

Sr. Informatica ETL Developer

Responsibilities:

  • Prepared High-level Design and Low-Level Design based on Functional and Business requirements of the project.
  • Designing & documenting the functional specs and preparing the technical design.
  • As a team, conducted gap analysis and discussions with subject matter experts to gather requirements, emphasize problem areas, and define deliverables.
  • Supported the development, optimization, and maintenance of Informatica mappings with various source data including Oracle and SQL.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Developed a POC for Informatica Power Exchange for HDFS and Hive on a multi-node Cloudera Hadoop distribution
  • Designed and developed UNIX shell scripts for creating and dropping tables, used for scheduling the jobs (a minimal sketch follows this list).
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed several complex mappings in Informatica using a variety of Power Center transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files.
  • Created the system to extract, transform, and load market data and check correctness of data loading, using UNIX Korn shell, Perl, and Oracle stored procedures.
  • Developed banking management scripts in Python to support the Chase website in creating user profiles and transactions for withdrawals and deposits.
  • Worked with IDQ on data quality: data cleansing, removing unwanted data, and verifying data correctness.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica Power Center. Involved in migration projects to move data from data warehouses on Oracle/DB2 to Teradata.
  • Created MDM mappings using IHA best practices to capture errors and achieve a clean load using Informatica Power Center.
  • Converted SAS scripts to Informatica. Migrated SSIS packages to Informatica. Created Informatica Cloud data synchronization tasks and task flows to extract data from Salesforce and load data from files into Salesforce.
  • Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica Power Center 9.1/9.5/9.6.1/10.1.
  • Created functional document with MDM data model configuration, source system definition, data mapping and cleansing requirements, trust score and matching rule definitions
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Worked with Teradata team in loading the data using relational connections and using different utilities to load the data like BTEQ, Fast Load, Multi Load, Fast Export, TPUMP, TPT scripts.
  • Developed mapping parameters and variables to support SQL override. Worked on import and export of data from sources to Staging and Target using Teradata MLOAD, Fast Export, TPUMP and BTEQ.
  • Developing workflows with Worklets, Event waits, Assignments, Conditional flows, Email and Command Tasks using Workflow Manager.
  • Working with an Agile, Scrum methodology to ensure delivery of high quality work with every monthly iteration.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Handling all Hadoop environments builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Worked with the third-party scheduler Autosys for scheduling Informatica PowerCenter workflows; involved with the scheduling team in creating and scheduling jobs in the Autosys Workload Scheduler.
  • Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
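
A minimal sketch of the kind of UNIX/BTEQ script used to drop and recreate a work table ahead of a scheduled load; the logon string and table definition are illustrative placeholders.

#!/bin/ksh
# Drop and recreate a Teradata work table before the nightly load (illustrative only).
bteq <<EOF
.LOGON tdprod/etl_user,etl_password;
DROP TABLE stage_db.wrk_claims;
CREATE TABLE stage_db.wrk_claims (
    claim_id INTEGER,
    load_dt  DATE
);
.QUIT;
EOF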

Environment: Informatica Power Center 10.x/9.6, Oracle 11g, SQL, IDQ, DB2, Hadoop, Teradata, DataStage, PL/SQL, PERL, Python, TOAD, Microsoft Visio, Autosys, Unix, SQL Server 2008.

Confidential, Overland Park, KS

Sr. Informatica ETL Developer

Responsibilities:

  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Worked on Teradata SQL, BTEQ, MultiLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to perform ETL interfaces using BTEQ, FastLoad, or FastExport. Created numerous Volatile, Global, Set, and MultiSet tables.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc. Developed data mappings between source systems and target system using Mapping Designer.
  • Developed shared folder architecture with reusable Mapplets and Transformations. Created batch jobs for Fast Export. Prepared Conceptual Solutions and Approach documents and gave Ballpark estimates.
  • Designed and developed Amazon Redshift databases. Worked with XML, Redshift, and flat file connectors.
  • Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.
  • Data modeling and design of the data warehouse and data marts in star schema methodology with conformed and granular dimensions and FACT tables.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data. Created SSIS ETL packages to get data from different sources like flat files, MS Excel, and MS Access.
  • Involved in migration projects to move data from data warehouses on Oracle/DB2 to Teradata. Developed a POC for Informatica Power Exchange for HDFS and Hive on a multi-node Cloudera Hadoop distribution.
  • Responsible for optimization of SQL queries, T-SQL and SSIS Packages. Loading data from large data files into Hive tables.
  • Imported and exported data between HDFS and Hive using Sqoop (an illustrative command follows this list). Coordinated and worked closely with legal, clients, third-party vendors, architects, DBAs, operations, and business units to build and deploy.
  • Used DataStage to manage the metadata repository and for import/export of jobs. Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Used the pmcmd command to start, stop, and ping the server from UNIX and created shell scripts to automate the process.
  • Created data synchronization tasks and task flows to extract data from Salesforce and load data from files into Salesforce.
  • Used Informatica Power Exchange for Mainframe to read/write VSAM files from/to the mainframe.
  • Working with an Agile, Scrum methodology to ensure delivery of high quality work with every monthly iteration.
  • Worked on production tickets to resolve the issues in a timely manner. Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.
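
An illustrative Sqoop import of a single source table into Hive; the JDBC connection string, credentials, and table names are placeholders.

# Import one Oracle table into a Hive staging table (illustrative only).
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --table MEMBER_CLAIMS \
  --hive-import \
  --hive-table staging.member_claims \
  --num-mappers 4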

Environment: Informatica Power Center 9.x, IDQ, Hadoop, Hive, Oracle 10g, Metadata, DataStage 8.7, Teradata, SQL Server 2008, SSIS, SSRS, T-SQL, TOAD, SQL Plus, SQL Query Analyzer, SQL Developer, MS Access, Tivoli Job Scheduler, Windows Azure.

Confidential, Shawnee Mission, KS

Sr. Informatica ETL Developer

Responsibilities:

  • Worked with Business Analysts (BA) to analyze the Data Quality issue and find the root cause for the problem with the proper solution to fix the issue
  • Documented the process that resolves the issue, covering analysis, design, construction, and testing for data quality issues
  • Involved in making data model changes and other changes to the transformation logic in existing mappings according to the business requirements for incremental fixes
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica Power Center and loaded data into the target database.
  • Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
  • Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
  • Used Connected and Unconnected Lookup transformations and Lookup Caches in looking up data from relational tables and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Extensively implemented SCD Type 2 mappings for CDC (change data capture) in the EDW. Performed code walkthroughs and reviews of documents prepared by other team members.
  • Involved in unit testing, integration testing, and data validation. Extensively involved in migrating Informatica maps from Dev to SIT, UAT, and Production environments.
  • Created Informatica Cloud data synchronization tasks and task flows to extract data from Salesforce and load data from files into Salesforce.
  • Developed a UNIX script to sftp, archive, cleanse, and process many flat files (a minimal sketch follows this list). Created and ran pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Workflow Manager
  • Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.
  • Created mapping parameters and variables and wrote parameter files. Implemented various performance tuning techniques by finding the bottlenecks at the source, target, mapping, and session levels and optimizing them.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on Scheduling Jobs and monitoring them through Control M and CA scheduler tool (Autosys).
  • Working with an Agile, Scrum methodology to ensure delivery of high quality work with every monthly iteration.
  • Responsible for optimization of SQL queries, T-SQL and SSIS Packages. Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move code to Production. Extensively worked with session logs and workflow logs for error handling and troubleshooting.
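
A minimal sketch of the kind of sftp/archive/cleanse script referenced above; host names, paths, and file patterns are illustrative placeholders.

#!/bin/ksh
# Archive yesterday's files, pull today's over sftp, and strip carriage returns (illustrative only).
SRC_HOST=vendor-sftp
IN_DIR=/data/inbound
ARC_DIR=/data/archive

mv $IN_DIR/*.dat $ARC_DIR/ 2>/dev/null

sftp -b - etl_user@$SRC_HOST <<EOF
cd /outbound
lcd $IN_DIR
mget *.dat
EOF

for f in $IN_DIR/*.dat; do
    tr -d '\r' < "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done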

Environment: Informatica Power Center 10.1/9.6.1/9.1.0 , Informatica Developer Client, IDQ, MDM, Power Exchange, DB2, SAP, Oracle 11g/10g, Hadoop HDFS, Hive, Sqoop, Syncsort, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.

Confidential, Kansas City, KS

Sr. Informatica ETL Developer

Responsibilities:

  • Involved in all the phases of SDLC. Worked closely with business analysts and data analysts to understand and analyze the requirement to come up with robust design and solutions.
  • Involved in data standardization, such as changing a reference data set to a new standard. Data validated by a third party before being provided to the internal transformations should still be checked for accuracy (DQ).
  • Involved in massive data profiling prior to data staging. Created profiles and score cards for the users.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Designed the mappings according to the OBIEE specifications, such as SDE (Source Dependent Extract) and SIL (Source Independent Load).
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5 by using various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, Connected and unconnected look up etc.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica Power Center and loaded data into the target database.
  • Created Informatica components required to operate Data Quality (Power Center required); designed best practices on process sequence, dictionaries, data quality lifecycles, naming conventions, and version control.
  • Developed the required Informatica mappings to process the data into Dimension and facts tables which satisfy the OBIEE reporting rules by interacting with reporting team.
  • Developed scripts for creating tables, views, synonyms, and materialized views in the data mart (a minimal sketch follows this list). Extensive data modeling experience using dimensional data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimension tables.
  • Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system. Utilized dimensional and star-schema modeling to come up with new structures to support drill-down.
  • Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes. Created mapping documents to outline source-to-target mappings and explain business-driven transformation rules.
  • Data sourced from a database that already enforces valid NOT NULL columns does not need to undergo a DQ completeness check.
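
A minimal sketch of the kind of data-mart DDL script referenced above, run through SQL*Plus; the connect string and object names are illustrative placeholders.

#!/bin/ksh
# Create a summary materialized view and a synonym in the data mart (illustrative only).
sqlplus -s etl_user/etl_password@DMPROD <<EOF
WHENEVER SQLERROR EXIT FAILURE;

CREATE MATERIALIZED VIEW mv_claim_summary
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT claim_year, SUM(paid_amt) AS total_paid
FROM   fact_claims
GROUP  BY claim_year;

CREATE OR REPLACE SYNONYM claim_summary FOR mv_claim_summary;
EXIT;
EOF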

Environment: Informatica Power Center 9.1/8.6, PL/SQL Developer, OBIEE, IDQ, Oracle 11g, UNIX, Microsoft SQL Server, TOAD, Teradata, Netezza.

Confidential, Topeka, KS

Sr. Informatica ETL Developer

Responsibilities:

  • Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Performed Source System Data Profiling using Informatica Data Explorer (IDE). Involved in designing Staging and Data mart environments and built DDL scripts to reverse engineer the logical/physical data model using Erwin.
  • Extracted data from SAP using Power Exchange and loaded data into SAP systems. Translated the business processes/SAS code into Informatica mappings for building the data mart.
  • Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance. Developed reusable transformations and Mapplets to use in multiple mappings.
  • Implementing Slowly Changing Dimensions (SCD) methodology to keep track of historical data. Assisted the QC team in carrying out its QC process of testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications (a minimal sketch follows this list). Involved in the complete cycle of extraction, transformation, and loading of data using Informatica best practices.
  • Involved in data quality checks by interacting with the business analysts. Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica Power Center and loaded data into the target database.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Involved as a part of Production support. Informatica Power Exchange for Mainframe was used to read/write VSAM files from/to the Mainframe.
  • Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission. Managed Change Control Implementation and coordinating daily, monthly releases and reruns.
  • Responsible for code migration, code review, test Plans, test scenarios, test cases as part of Unit/Integrations testing, UAT testing.
  • Used Teradata utilities such as MultiLoad, FastLoad, and TPump. Used UNIX scripts for automating processes
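
A minimal sketch of the kind of post-session notification script referenced above; the argument order, mail command, and addresses are illustrative assumptions, not taken from the project.

#!/bin/ksh
# Post-session script: mail the support group if the session did not succeed (illustrative only).
SESSION_NAME=$1
STATUS=$2
LOG_FILE=$3

if [ "$STATUS" != "SUCCEEDED" ]; then
    mailx -s "ETL FAILURE: $SESSION_NAME" etl-support@example.com < "$LOG_FILE"
    exit 1
fi
exit 0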

Environment: Informatica Power Center 9.1.1, Informatica Developer Client, IDQ, Power Exchange, SAP, Oracle 11g, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.
