
ETL Developer Resume

Concord, NH

SUMMARY:

  • Over 7 years of experience in Data Warehousing using Informatica and Talend as ETL tools.
  • Exposure to various domains including Health Care, Insurance, Banking, Retail and Finance.
  • Conversant with all phases of the Software Development Life Cycle (SDLC), including System Analysis, Design, Development and Implementation.
  • Good at Relational Database Management System (RDBMS) concepts.
  • Working Knowledge on Kimball/Inmon Methodologies.
  • Experience in Data Extraction, Transformation and Loading using Informatica PowerCenter 9.5.1/9.1.1/8.x/7.x.
  • Worked on tools such as Repository Manager, Workflow Manager, Workflow Monitor, Designer (including Mapping Designer, Transformation Developer, Mapplet Designer, Source Analyzer and Target Designer), Admin Console and Data Administration.
  • Experience in designing complex mappings using Source Qualifier, Update Strategy, Expression, Sequence Generator, Aggregator, Filter, Joiner, Java, Router, Connected Lookup, Unconnected Lookup, Normalizer, Rank, XML, Sorter, Transaction Control and Union transformations to load data into different target types from sources such as flat files (delimited and fixed-width text or CSV files), XML, Oracle, IBM Mainframe COBOL files, SQL Server, DB2, Teradata and Netezza.
  • Experience in managing Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3, Data Marts, OLAP, OLTP and Change Data Capture (CDC).
  • Experience in using BI tools like SSIS and converting the SSIS packages to Informatica mappings.
  • Worked in Workflow Manager, creating Sessions, Workflows, Worklets, reusable tasks (Session, Command, Email) and non-reusable tasks (Decision, Event Wait and Event Raise).
  • Good at Pre-SQL and Post-SQL at session level in Informatica.
  • Knowledge of Data Cleansing, Data Staging, Data Profiling, Error Handling, session log files, workflow log files and performance optimization such as Pushdown Optimization (PDO) and session partitioning to troubleshoot bottlenecks at the source, target, mapping, session and system levels.
  • Worked on different databases like Teradata, Oracle, DB2, Sybase, SQL Server, MS Access.
  • Good Knowledge of Data Warehousing/Data Modeling using ERwin, staging tables, stored procedures, Functions, cursors, Dimension tables, Fact tables, Surrogate Key, Primary keys, Foreign Keys, Star Schema, Snow Flake Schema, Triggers and Normalization/Denormalization.
  • Experienced in performance tuning like creating indexes at database level.
  • Experience in Scheduling Tools like Tidal, Autosys, Control-M, and Workload manager.
  • Worked with versioning tools such as ClearCase, TortoiseSVN and PC-based version control.
  • Good knowledge of Teradata utilities such as MultiLoad, FastLoad, TPump, FastExport and BTEQ scripts.
  • Good Knowledge of writing PL/SQL procedures.
  • Experience in using Oracle Development tools such as Tool for Oracle Application Development (TOAD).
  • Good Knowledge on Oracle utilities like SQL Loader, SQL Developer.
  • Hands-on experience in writing simple and complex SQL queries using subqueries and multiple-table joins (left, right and inner joins).
  • Experience in UNIX Shell Scripting and batch scripting for parsing files.
  • Experience in using the Informatica command line utilities like PMCMD, PMREP to execute workflows.
  • Good verbal and written communication skills; hardworking, self-motivated, able to work independently or cooperatively in a team, eager to learn and quick to grasp new concepts.
  • Good experience in ETL technical documentation.
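
The shell scripting for parsing flat files mentioned above can be illustrated with a small sketch. The file name, delimiter and column count below are hypothetical examples, not taken from any actual project:

```shell
# Illustrative sketch: validating a pipe-delimited flat file before an ETL load.
set -eu

FEED=/tmp/customer_feed.dat
EXPECTED_COLS=3

# Create a small sample feed (header + two detail rows) for the demo.
printf 'CUST_ID|NAME|STATE\n101|Alice|NH\n102|Bob|MO\n' > "$FEED"

# Count detail rows and flag any row whose column count is wrong.
awk -F'|' -v cols="$EXPECTED_COLS" '
    NR == 1 { next }             # skip header
    NF != cols { bad++ }         # malformed row
    { rows++ }
    END { printf "rows=%d bad=%d\n", rows, bad+0 }
' "$FEED"
```

A check like this typically runs as a pre-session command so that a malformed feed fails fast instead of loading partial data.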

TECHNICAL SKILLS:

ETL: Informatica PowerCenter 9.5.1/9.1.1/8.x/7.x, Talend 6.4.1

Databases: Oracle 11g/10g, SQL Server 2012/2008 R2/2003, DB2/UDB, MS Access, Teradata V13/V2R12, Sybase

Methodologies: Star schema and Snowflake schema

Languages: SQL, PL/SQL, C and data structures, C++, JAVA, UNIX shell scripting, Batch scripting, Perl Scripting

Operating system: UNIX, Windows Server, Linux

Tools: Putty, WinSCP, Toad, AQT, Autosys, Control-M, Workload Manager, Clearcase, SVN, PC Based version control, ERwin, Tidal

Web Design: PHP, HTML

Utilities: SQL Loader, MultiLoad, FastLoad, FastExport, TPump, BTEQ, PMCMD, PMREP, SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, Concord, NH

ETL Developer

Responsibilities:

  • Worked on multiple projects simultaneously while interacting closely with the onshore and offshore teams.
  • Responsible for integrating customer data from the Zoho CRM portal to the Prospero database, and from the database back to the Zoho CRM portal, using Talend ETL components and jobs for the Jamaican Stock and Securities project.
  • Responsible for converting Lead customers to Account customers, Actor customers, Portfolio customers and Additional Actors.
  • Responsible for integrating customer documents with the specific actor.
  • Responsible for designing the data warehouse by extracting data from Oracle tables and from the CRM portal.
  • Responsible for taking the backup/recovery of the server.
  • Extracted and transformed the data from various sources such as Oracle, Flat files, XML files and loaded into the SQL Server data warehouse using Informatica as the ETL Tool for the Emcare project.
  • Responsible for cleansing the data before loading it into the database.
  • Created stored procedures to fetch data from multiple tables using joins, sub-queries and views.
  • Extracted data from a Microsoft Access database and converted the SQL scripts to MySQL for the RMI Logistics Migration project.
  • Responsible for creating unit test plans using different scenarios.
  • Used BIME tool to create customer reports.
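
The pre-load cleansing step above can be sketched in shell. The file names and cleansing rules here are placeholders; the real rules were project-specific:

```shell
# Hedged sketch of a pre-load cleansing pass: trim stray spaces,
# drop blank lines and remove exact duplicate rows.
set -eu

RAW=/tmp/raw_customers.csv
CLEAN=/tmp/clean_customers.csv

# Sample raw extract: stray spaces, a blank line and a duplicate row.
printf '101, Alice ,NH\n\n102,Bob,MO\n101, Alice ,NH\n' > "$RAW"

awk -F',' 'NF {
    for (i = 1; i <= NF; i++) { gsub(/^ +| +$/, "", $i) }   # trim each field
    line = $1 "," $2 "," $3
    if (!seen[line]++) print line                            # first occurrence only
}' "$RAW" > "$CLEAN"

cat "$CLEAN"
```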

Environment: Windows 7, Windows Scheduling Tool, Informatica 9.5.1, Talend 6.4.1, Prospero Oracle 11g, SQL Server 2012, MySQL 5.0, AWS.

Confidential, St. Louis, MO

ETL Database Analyst

Responsibilities:

  • Interacted closely with data analysis and development teams to develop requirements for Data Quality and file delivery offerings.
  • Worked with offshore teams in India, the Philippines and other parts of the USA to run batch scripts that maintain production jobs.
  • Troubleshot and fixed data quality issues on both the Oracle (ODS - Operational Data Store) and DB2 (BDW - Brokerage Data Warehouse) platforms using Informatica, Autosys and UNIX shell scripts.
  • Analyzed the data based on requirements, wrote the techno-functional documentation and developed complex mappings using Informatica Data Quality (IDQ) 9.6.1.
  • Provided guidance to software development teams on production supportability, coding and deployment standards, and adherence to security policies.
  • Responsible for granting/revoking the access for the users and deployment of the scripts to the production server.
  • Responsible for installation, patching and upgrades of the server.
  • Provided technical support for database access control, job execution and other database maintenance tasks.
  • Responsible for file movement in and out of the Confidential Advisor’s distribution data environment using CFD (Central File Distribution).
  • Performed file layout change and created supporting documents such as design document, impact analysis document and transition guide.
  • Created Unit test plans and did unit testing using different scenarios separately for every process in the development phase.
  • Provided on-call support for the production system to resolve any issues.
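
The file-movement work described above can be sketched as a small shell script. The inbox and archive directory names are hypothetical placeholders, not the actual CFD paths:

```shell
# Illustrative sketch of a daily feed sweep: move delivered files
# into a dated archive folder and log each action.
set -eu

INBOX=/tmp/cfd_inbox
ARCHIVE=/tmp/cfd_archive/$(date +%Y%m%d)
mkdir -p "$INBOX" "$ARCHIVE"

# Simulate two delivered files.
touch "$INBOX/positions.dat" "$INBOX/trades.dat"

for f in "$INBOX"/*.dat; do
    mv "$f" "$ARCHIVE/"
    echo "$(date '+%H:%M:%S') archived $(basename "$f")"
done
```

In production a script like this would typically run under a scheduler such as Autosys, with the log lines feeding an audit trail.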

Environment: Oracle 11g, DB2 (UDB) 10.1(fp5), Informatica 9.6.1(HF 4), Unix-AIX, Autosys.

Confidential, Norfolk, VA

Informatica Developer

Responsibilities:

  • Interacted with Business Analyst to understand the requirements and translated them into appropriate technical requirement document.
  • Coordinated with the client team on a daily basis for issues and status updates.
  • Conducted technical design presentations to the client and obtained sign-off.
  • Designed mappings using transformations such as Lookup (connected, unconnected and dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank and Aggregator in Informatica PowerCenter Designer.
  • Analyzed different data sources, including Oracle and flat files (delimited and fixed-width text files) supplying the vendor data, and loaded the data into SQL Server.
  • Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of customers.
  • Worked on creating new database (BHAS) by pulling the data from FACETS and ODW and responsible for creating the facts and dimensions.
  • Worked closely with an architect and data modeler in designing and implementing ER models and dimensional models.
  • Worked on migration of Informatica power center 9.5.1 to 9.6 version.
  • Worked on migration of Informatica power center from 9.1 to 9.5.1 version.
  • Responsible for performance optimization by writing SQL overrides instead of using transformations, implementing active transformation like filter as early as possible in the mapping, selecting sorted input when using Aggregator or Joiner transformations.
  • Implemented performance tuning at all levels like source level, target level, mapping level and session level.
  • Created and configured workflows, Worklets and sessions using Informatica Workflow Manager.
  • Created a command task at the workflow level to gather flat files with the same structure for Indirect File Loading, and a second command task at the end to move all the flat files to an archive directory.
  • Used workflow level variables for the reusability of the code.
  • Used Mapping Parameters and Variables to implement Object Orientation techniques and facilitate the reusability of code.
  • Worked with SQL sub queries and joins between multiple tables.
  • Worked on stored procedures for an incremental load.
  • Worked on Design Error Handling process in ETL.
  • Created reusable tasks at workflow level.
  • Created reusable transformations at mapping level in power center designer.
  • Created Tidal Workflow Job streams for each subject area and scheduled as per the business requirements.
  • Worked closely with the enterprise reporting team and helped them get the data for creating reports.
  • Have performed code promotion from development level to production level.
  • Worked on waterfall methodology.
  • Prepared an ETL technical document maintaining the naming standards.
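
The indirect-file-loading command task described above can be sketched in shell. Paths and file names are illustrative; in a real session the source would point at the generated list file:

```shell
# Sketch: build an Informatica indirect-load file list from same-structure
# flat files, then sweep the processed files into an archive.
set -eu

SRC=/tmp/ind_src
ARCH=/tmp/ind_arch
LIST=$SRC/filelist.txt
mkdir -p "$SRC" "$ARCH"

# Simulate three same-layout daily files.
printf '1|a\n' > "$SRC/day1.dat"
printf '2|b\n' > "$SRC/day2.dat"
printf '3|c\n' > "$SRC/day3.dat"

# The list file names every flat file to be read in one indirect load.
ls "$SRC"/*.dat > "$LIST"

# After the (simulated) load, archive the processed files.
mv "$SRC"/*.dat "$ARCH/"
```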

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, Tidal, SQL Server 2012, SQL Server 2008, TOAD, Windows 2012, ERwin 9.64.

Confidential, LA, CA

ETL/Informatica Developer

Responsibilities:

  • Interacted with Business Analyst to understand the requirements and translated them into appropriate technical requirement document.
  • Coordinated with the client team on a daily basis for issues and status updates.
  • Conducted technical design presentations to the client and obtained sign-off.
  • Designed mappings using transformations such as Lookup (connected, unconnected and dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank and Aggregator in Informatica PowerCenter Designer.
  • Analyzed different data sources, including Oracle, flat files (delimited and fixed-width text files) and XML files supplying the contract and billing data, understood the relationships by analyzing the OLTP sources, and loaded the data into the Teradata warehouse.
  • Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of customers.
  • Responsible for performance optimization by writing SQL overrides instead of using transformations, implementing active transformation like filter as early as possible in the mapping, selecting sorted input when using Aggregator or Joiner transformations.
  • Implemented performance tuning at all levels like source level, target level, mapping level and session level.
  • Created and configured workflows, Worklets and sessions using Informatica Workflow Manager.
  • Created a command task at the workflow level to gather flat files with the same structure for Indirect File Loading, and a second command task at the end to move all the flat files to an archive directory.
  • Used workflow level variables for the reusability of the code.
  • Used Mapping Parameters and Variables to implement Object Orientation techniques and facilitate the reusability of code.
  • Worked on Informatica B2B DT/DX tool for XML files.
  • Worked on stored procedures.
  • Worked on the high availability option in PowerCenter.
  • Worked with SQL sub queries and joins between multiple tables.
  • Worked on Design Error Handling process in ETL.
  • Worked on migration of SSIS packages to Informatica mappings.
  • Created reusable tasks at workflow level.
  • Created reusable transformations at mapping level in power center designer.
  • Worked with version control using ClearCase.
  • Scheduled jobs using Control-M.
  • Worked with Oracle tools such as TOAD.
  • Worked with Teradata utilities such as FastLoad and TPump.
  • Worked closely with the MicroStrategy reporting team and helped them get the data for creating reports.
  • Have performed code promotion from development level to production level.
  • Worked on agile methodology.
  • Prepared an ETL technical document maintaining the naming standards.
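
The Teradata bulk loads above would typically be driven by a generated FastLoad control script. The table, logon string and file names below are made-up placeholders, and the script is only generated here, not submitted:

```shell
# Hedged sketch: emit a FastLoad control script for a staging-table load.
set -eu

CTL=/tmp/load_contracts.fastload

cat > "$CTL" <<'EOF'
LOGON tdprod/etl_user,password;
DATABASE edw_stage;
BEGIN LOADING stg_contracts ERRORFILES err1_contracts, err2_contracts;
SET RECORD VARTEXT "|";
DEFINE contract_id (VARCHAR(18)), amount (VARCHAR(18))
FILE = /data/in/contracts.dat;
INSERT INTO stg_contracts VALUES (:contract_id, :amount);
END LOADING;
LOGOFF;
EOF

# The generated script would be submitted with:  fastload < "$CTL"
wc -l "$CTL"
```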

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, Teradata V13, Teradata utilities, Windows Server 2012, PL/SQL.

Confidential

ETL Developer

Responsibilities:

  • Worked with the data analyst to understand the requirements of the project.
  • Worked with various data sources such as Oracle, flat files and Sybase containing customer data, and loaded the data into the Oracle warehouse.
  • Involved in designing of Informatica mappings by using transformations like source qualifier, expression, update strategy, router, joiner, connected lookup, unconnected lookup, dynamic lookup, transaction control, rank, sequence generator, Aggregator.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type2.
  • Created reusable transformations and Mapplets and used them in mappings.
  • Worked on shared folder.
  • Worked with Oracle utilities (SQL Loader), scheduling tools (Autosys, Control-M) and version control.
  • Created PL/SQL procedures for certain business requirements.
  • Designed SQL overrides in the Source Qualifier according to business requirements.
  • Created sessions and workflows for the designed mappings, including sequential workflows, using Informatica Workflow Manager.
  • Monitored the session logs to check the progress of data loads.
  • Supported testing team, UAT team and production team.
  • Worked on ETL document.
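
The session-log monitoring mentioned above can be sketched with plain shell tools. The log lines below are invented for the demo and do not reproduce any actual Informatica log format:

```shell
# Illustrative sketch: check a session log for completion and rejected rows.
set -eu

LOG=/tmp/s_m_load_customers.log

# Made-up sample log for the demo.
cat > "$LOG" <<'EOF'
02:10:04 session s_m_load_customers started
02:14:52 applied rows: 1500 rejected rows: 0
02:14:53 session s_m_load_customers completed successfully
EOF

# Flag success and pull the rejected-row count from the summary line.
grep -q 'completed successfully' "$LOG" && echo "load OK"
awk '/rejected rows/ { print "rejected=" $NF }' "$LOG"
```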

Environment: Oracle 10g, Informatica PowerCenter 7.1, Sybase, Toad, SQL Loader, Autosys, Control-M, Perl scripts, Windows Server 2003/R2.
