
Sr. Informatica Developer Resume

Irving, TX

SUMMARY:

  • 9+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • Highly skilled ETL engineer with 9+ years of software development experience in tools such as Informatica, SSIS, and Talend.
  • Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
  • 3+ years of experience with Talend ETL Enterprise Edition for Big Data, Data Integration, and Data Quality.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.
  • Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems, including Oracle 10g/9i/8i/7.x, DB2, Netezza, SQL Server, Teradata, Hive, and HANA, as well as non-relational sources such as flat files, XML, and mainframe files.
  • Involved in code migrations from Dev to QA and Production, and provided operational instructions for deployments.
  • Worked hands-on migrating DataStage 8.7 ETL processes to Talend Studio.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
  • Strong data warehousing ETL experience using Informatica 9.x/8.x/7.x Power Center client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Hands-on experience with Pentaho Business Intelligence Server and Studio.
  • Expertise in using transformations such as Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS (a minimal Hive/HDFS sketch follows this summary).
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands on experience in developing and monitoring SSIS/SSRS Packages and outstanding knowledge of high availability SQL Server solutions, including replication.
  • Excellent experience in designing and developing multi-layer Web-based information systems using Web Services, including Java and JSP.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio.
  • Expertise in working with relational databases such as Oracle 12c/11g/10g/9i/8.x, SQL Server 2012/2008/2005, DB2 8.0/7.0, UDB, MS Access, Teradata, and Netezza.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, and SQL Assistant.
  • Experienced in integrating various data sources such as Oracle SQL, PL/SQL, Netezza, SQL Server, and MS Access into the staging area.
  • Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with them, including using Netezza utilities to load data and execute SQL scripts from UNIX (a shell automation sketch follows this summary).
  • Proficient in integrating various data sources with multiple relational databases such as Oracle 12c/11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files, and flat files into the staging area, ODS, data warehouse, and data marts.
  • Extensively worked with the Netezza database to implement data cleanup and performance-tuning techniques.
  • Experienced in using Automation Scheduling tools like Autosys and Control-M.
  • Experience in migrating data from multiple applications into a single application.
  • Responsible for data migration from MySQL to Oracle databases.
  • Experienced in batch scripting on Windows and worked extensively with slowly changing dimensions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
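
Illustrative Hive/HDFS load sketch (a minimal example of the pattern referenced above; the file, database, and table names are hypothetical, and it assumes the standard hadoop and hive command-line clients):

    #!/bin/sh
    # Stage a delimited extract into HDFS, then load it into a Hive table.
    SRC_FILE=/data/extracts/sales_20160101.dat
    HDFS_DIR=/user/etl/staging/sales

    hadoop fs -mkdir -p "$HDFS_DIR"
    hadoop fs -put -f "$SRC_FILE" "$HDFS_DIR/"

    # Load the staged file into the Hive table and verify the row count.
    hive -e "
      LOAD DATA INPATH '$HDFS_DIR/sales_20160101.dat' INTO TABLE edw_stage.sales_daily;
      SELECT COUNT(*) FROM edw_stage.sales_daily;
    "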
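
Illustrative UNIX shell automation sketch with Netezza utilities (a minimal example only; the host, database, table, and file names are hypothetical, and it assumes the standard nzload and nzsql client utilities with the password held in the NZ_PWD environment variable):

    #!/bin/sh
    # Bulk-load the day's extract into a Netezza staging table,
    # then run a post-load SQL script.
    NZ_HOST=nzprod
    NZ_DB=EDW
    NZ_USER=etl_user

    nzload -host "$NZ_HOST" -db "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PWD" \
           -t SALES_STG -df /data/incoming/sales.dat -delim '|' || exit 1

    nzsql -host "$NZ_HOST" -d "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PWD" \
          -f /scripts/sales_post_load.sql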

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Talend (TOS, TIS), Informatica Power Center 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Informatica Server), SSIS, Ab Initio.

Databases: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2012/2008/2005, DB2 v8.1, Netezza, Teradata, HBase.

Methodologies: Data Modeling - Logical, Physical, and Dimensional Modeling (Star/Snowflake)

Languages: SQL, PL/SQL, UNIX Shell scripting, C++, SOAP UI, JSP, Web Services, JavaScript, HTML, Eclipse

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, Clear Test, ClearCase.

PROFESSIONAL EXPERIENCE:

Confidential, Irving, TX

Sr. Informatica developer

Responsibilities:

  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in dimensional modeling (Star Schema) of the data warehouse and used ERwin to design the business process, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Used Address Doctor extensively for address validations; built several reusable components in IDQ using Standardizers and Reference tables that can be applied directly to standardize and enrich address information.
  • Experience in extracting addresses from multiple heterogeneous sources such as flat files, Oracle, SAS, and SQL Server.
  • Created custom rules to validate zip codes, states and segregated address data based on country.
  • Created web services for address mapplets of different countries to integrate with SOAP UI.
  • Used the Informatica MDM 10.1 (Siperian) tool to manage master data in the EDW.
  • Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Cleansed and scrubbed data into uniform data types and formats using Informatica MDM and IDQ tools; loaded data to STAGE and HUB tables, then to the EDW, and finally to dimensions, rolling the data up by business grain into the FACT tables.
  • Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and linking rules, including Address Doctor, to Power Center ETL processes for delivery to the MDM Data Hub and other data consumers.
  • Developed MDM Hub match-and-merge rules, batch jobs, and batch groups.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data, with capabilities that surpass usual B2B solutions.
  • Worked with the B2B Operation Console to create partners and configure Partner Management, Event Monitors, and Events.
  • Exposure to Informatica B2B Data Transformation, which supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards that govern data formats.
  • Worked on different tasks in workflows, including Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, and Timer, as well as workflow scheduling.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables (a SQL sketch of the Type 2 pattern follows this list).
  • Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up the repository and folders (a pmcmd sketch follows this list).
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
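
Illustrative SQL sketch of the Type 2 SCD pattern mentioned above (the actual work was built as Informatica mappings; the table, column, and sequence names here are hypothetical, shown as Oracle SQL run through sqlplus):

    #!/bin/sh
    sqlplus -s "$ORA_CONN" <<'EOF'
    -- Expire the current dimension row when a tracked attribute changed
    UPDATE dim_customer d
       SET d.end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert a new current row for new and changed customers
    INSERT INTO dim_customer
      (customer_key, customer_id, address, eff_date, end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');
    COMMIT;
    EOF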
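
Illustrative pmcmd automation sketch (a minimal example only; the domain, integration service, folder, and workflow names are hypothetical, and the password is read from an environment variable via pmcmd's -pv option):

    #!/bin/sh
    # Start an Informatica workflow and wait for it to complete;
    # alert the support mailbox on failure.
    pmcmd startworkflow \
      -sv IntSvc_EDW -d Domain_EDW \
      -u etl_ops -pv ETL_OPS_PWD \
      -f EDW_LOADS -wait wf_daily_sales_load

    if [ $? -ne 0 ]; then
      echo "wf_daily_sales_load failed" | mailx -s "ETL failure" etl-support@example.com
      exit 1
    fi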

Environment: Informatica Power Center 10/9.6.1, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, IDQ 9.6/9.5, MDM, B2B Adapter, Data Analyzer 9.1, PL/SQL, Oracle 11g, ERwin, Autosys, ERP, UltraEdit, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 9.

Confidential, Rocky Hill, CT

Sr. ETL/Talend Developer

Responsibilities:

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared weekly status reports and coordinated weekly status calls with the technology lead and business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager, and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources during mapping development to analyze the content, quality, and structure of source data.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, flat files, etc.
  • Used the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java, SQL Transformation, etc.) to interpret the business logic into the ETL mappings.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Used Mapplets and reusable transformations to prevent redundancy of transformation usage and improve maintainability.
  • Created complex Informatica mappings, as well as simple mappings containing complex SQL, based on the needs and requirements of business users.
  • Used Informatica features to implement Type 1, 2, and 3 changes in slowly changing dimensions and Change Data Capture (CDC).
  • Created different database triggers; created and configured workflows, worklets, and sessions to transport the data to target systems using Informatica Workflow Manager.
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Automated workflows with UNIX scripts using pmcmd and pmserver commands.
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Created tables, views, primary keys, indexes, constraints, sequences, grants, and synonyms.
  • Involved in developing optimized PL/SQL code for server-related packages to centralize the application; procedures containing PL/SQL were created, stored in the database, and fired when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Conducted Design and Code reviews and extensive documentation of standards, best practices and ETL Procedures.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled performance tuning of Informatica mappings at various levels to meet the established standard throughput.
  • Analyzed the target data mart for accuracy of data against the pre-defined reporting needs.
  • Wrote complex SQL to achieve and interpret the reporting needs in the ETL process, and worked on SQL tuning to achieve maximum throughput.
  • Assisted in all aspects of the project to meet the scheduled delivery time.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Conducted unit testing of all ETL mappings as well as helped QA team in conducting their testing.
  • Wrote UNIX shell scripts to work with flat files, define parameter files, and create pre- and post-session commands (a parameter-file sketch follows this list).
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed unit and grid integration testing and validated results with end users.
  • Worked as part of a team and provided 24x7 production support.
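
Illustrative pre-session shell sketch for the parameter-file work described above (the folder, workflow, parameter, and path names are hypothetical):

    #!/bin/sh
    # Build the day's Informatica parameter file and archive processed flat files.
    RUN_DT=$(date +%Y%m%d)
    PARAM_FILE=/infa/params/wf_daily_sales_load.parm
    SRC_DIR=/data/incoming
    ARCH_DIR=/data/archive

    cat > "$PARAM_FILE" <<EOF
    [EDW_LOADS.WF:wf_daily_sales_load]
    \$\$RUN_DATE=$RUN_DT
    \$InputFile_sales=$SRC_DIR/sales_$RUN_DT.dat
    EOF

    # Move already-processed files out of the landing area
    find "$SRC_DIR" -name 'sales_*.dat' -mtime +0 -exec mv {} "$ARCH_DIR"/ \;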

Environment: Informatica Power Center 9.5, ERwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle SQL Developer, SQL Server 2008, flat files, XML, mainframe COBOL files, Autosys, UNIX shell scripting, Subversion.

Confidential, Rocky Hill, CT

ETL/Talend Developer

Responsibilities:

  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Designed the ETL process using Talend to load data from sources to targets through data transformations.
  • Worked extensively with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.
  • Developed advanced Oracle stored procedures and handled SQL performance tuning.
  • Involved in creating the mapping documents with the transformation logic for implementing few enhancements to the existing system.
  • Monitored and supported Talend jobs scheduled through the Talend Administration Center (TAC).
  • Developed Talend mappings using various transformations, sessions, and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files, and a Teradata database.
  • Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad, and FastExport) and queried the target database using Teradata SQL and BTEQ for validation (a BTEQ sketch follows this list).
  • Used Talend to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Created connections to databases such as SQL Server, Oracle, and Netezza, as well as application connections.
  • Created mapping documents to outline data flow from sources to targets.
  • Prepared Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structure and physical design.
  • Involved in dimensional modeling (Star Schema) of the data warehouse and used ERwin to design the business process, dimensions, and measured facts.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Talend.
  • Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed Talend jobs to load data into Hive tables and HDFS files, and developed Talend jobs to integrate data from Hive tables into the Teradata system.
  • Worked on different tasks in workflows, including Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, and Timer, as well as workflow scheduling.
  • Performed unit testing and code reviews, and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load data into Hive tables and created Talend jobs to load data into Oracle and Hive tables.
  • Migrated the code into QA (testing) and supported the QA team and UAT users.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Worked with high volumes of data and tracked performance of Talend job runs and sessions.
  • Conducted code reviews of work developed by teammates before moving the code into QA.
  • Experience in batch scripting on Windows, including Windows 32-bit commands, quoting, and escaping.
  • Used Talend reusable components such as routines, context variables, and globalMap variables.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Knowledge of Teradata utility scripts such as FastLoad and MultiLoad for loading data from various source systems into Teradata.
  • Modified existing mappings for enhancements of new business requirements.
  • Worked on migration projects to migrate data from data warehouses on Oracle/DB2 to Netezza.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
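
Illustrative BTEQ validation sketch (a minimal example; the TDPID, user, and table names are hypothetical, with the password supplied through the TD_PWD environment variable):

    #!/bin/sh
    # Validate a FastLoad/MultiLoad run by counting today's rows in the target.
    bteq <<EOF
    .LOGON tdprod/etl_user,$TD_PWD
    SELECT COUNT(*) AS tgt_rows
    FROM   edw.sales_fact
    WHERE  load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF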

Environment: Talend (TOS, TIS), Hive, Pig, Hadoop 2.2, Sqoop, PL/SQL, Oracle 12c/11g, ERwin, Autosys, SQL Server 2012, Teradata, Netezza, Sybase, SSIS, UNIX.

Confidential

Informatica Developer

Responsibilities:

  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse database.
  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Created tables, views, indexes, sequences, and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Transferred data into the database using SQL*Loader (a SQL*Loader sketch follows this list).
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Extracted data from legacy systems, Oracle, and SQL Server sources and loaded it into the target.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created cron jobs to automate the scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.
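
Illustrative SQL*Loader sketch (a minimal example; the control file, table, columns, and connect string are hypothetical):

    #!/bin/sh
    # Generate a control file and load a delimited extract into Oracle.
    cat > drugs_inv.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/incoming/drugs_inventory.dat'
    APPEND INTO TABLE drugs_inventory
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (drug_id, drug_name, qty_on_hand, purchase_dt DATE "YYYY-MM-DD")
    EOF

    sqlldr userid="$ORA_CONN" control=drugs_inv.ctl \
           log=drugs_inv.log bad=drugs_inv.bad errors=50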

Environment: Informatica Power Center 7.1, Oracle 9, SQL Server 2005, XML, SQL, PL/SQL, UNIX Shell Script.

Confidential

Software Engineer

Responsibilities:

  • Involved in creating database objects such as tables, stored procedures, views, triggers, and user-defined functions for the project I was working on.
  • Analyzed client requirements and translated them into technical requirements.
  • Gathered requirements from end users and was involved in developing the logical model and implementing requirements in SQL Server 2000.
  • Performed data migration (import and export via BCP) from text files to SQL Server (a bcp sketch follows this list).
  • Responsible for creating reports based on the requirements using Reporting Services 2000.
  • Identified the database tables for defining the queries for the reports.
  • Worked on SQL server queries, stored procedures, triggers and joins.
  • Defined report layouts for formatting the report design as per the need.
  • Identified and defined the datasets for report generation.
  • Formatted the reports using global variables and expressions.
  • Deployed generated reports onto the report server so they could be accessed through a browser.
  • Maintained data integrity by performing validation checks.
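
Illustrative BCP import sketch (a minimal example shown as a single command line; the server, database, table, and file names are hypothetical, with the password held in the MSSQL_PWD environment variable):

    # Bulk-import a comma-delimited text file into SQL Server,
    # capturing rejected rows in an error file.
    bcp SalesDB.dbo.customers in /data/customers.txt \
        -S SQLPROD01 -U etl_user -P "$MSSQL_PWD" \
        -c -t ',' -r '\n' -e customers.err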

Environment: MS SQL 2000, Windows Server 2000, SQL Query Analyzer and Enterprise Manager, MS Access 2000, and Windows NT platform.
