
Sr. ETL Developer Resume


Spring House, PA

SUMMARY:

  • Over 9 years of experience in information technology, spanning analysis, design, development, testing, maintenance, and enhancement of data warehousing, data integration, and client-server systems in government services, healthcare, banking, insurance, manufacturing, and sales.
  • Over seven years of strong data warehousing experience in business requirement analysis, design, development, testing, implementation, loading, maintenance, and enhancement of data warehouse and ODS systems using Informatica 9.x/8.x/7.x/6.x, Oracle 11g/10g/9i, DB2, SQL Server, Teradata, and PL/SQL; well versed in data warehousing architecture, operational data stores, and dimensional data modeling.
  • Extensive knowledge of dimensional data modeling, star and snowflake schemas, and fact and dimension tables.
  • Interacted heavily with source system owners and business users to gather requirements, design tables, standardize interfaces for receiving data from multiple operational sources, and define strategies for loading the staging area and data warehouse.
  • Worked extensively with source and target systems such as XML, flat files, relational tables, and message queues.
  • Excellent exposure to the Informatica Administrator console, including creating and restoring backups/contents, recycling services (Repository and Integration Services), maintaining log files, creating users, groups, roles, and folders, assigning privileges and permissions, and creating labels and deployment groups for object migrations.
  • Hands-on experience with Teradata V2R5/V2R6 and utilities such as MultiLoad, FastLoad, TPump, FastExport, SQL Assistant, and BTEQ.
  • Used features like pre-session/post-session commands/SQL, fine-tuned databases, mappings, and sessions to achieve optimal performance.
  • Implemented Change Data Capture (CDC) techniques.
  • Experienced with Informatica DT Studio; well versed in creating mappers, parsers, and serializers and integrating the code with Power Center.
  • Implemented slowly changing dimensions (Type 1, Type 2, and Type 3), slowly growing targets, and simple pass-through mappings using Power Center.
  • Experience in creating mapplets in IDQ/ Informatica Developer for data standardization.
  • Working knowledge on Informatica MDM tool set.
  • Experienced working with Unix Shell scripts.
  • Worked with Salesforce as a source using Power Exchange.
  • Worked with web services as data sources.
  • Provided production support for my projects.
  • Worked on Hadoop and its components, including HDFS, MapReduce, Pig, Hive, and HBase.
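
As an illustration of the slowly changing dimension work cited above, the Type 2 pattern (expire the current row, insert a new version) can be sketched in Python. This is a simplified stand-in for the original Informatica mappings; the column names and load date are hypothetical.

```python
from datetime import date

def scd_type2_apply(dimension, incoming, load_date=date(2024, 1, 1)):
    """Expire the current row and insert a new version whenever a
    tracked attribute changes (illustrative Type 2 logic only)."""
    result = list(dimension)
    for row in dimension:
        if not row["current"]:
            continue
        new_val = incoming.get(row["id"])
        if new_val is not None and new_val != row["attr"]:
            row["current"] = False        # expire the old version
            row["valid_to"] = load_date
            result.append({               # open the new version
                "id": row["id"], "attr": new_val,
                "valid_from": load_date, "valid_to": None, "current": True,
            })
    return result
```

A Type 1 change would instead overwrite `attr` in place, and Type 3 would keep the prior value in an extra column; Type 2 is shown because it is the one that grows the dimension.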

SKILLS:

Data Warehousing: Informatica Power Center 9.6.1/9.5.1/9.1/8.x, Informatica Data Transformation Studio, Informatica MDM, Power Mart 7.0/6.2/5.1.1/5.0, Power Connect, Power Analyzer ETL, Informatica Power Exchange 8.1.1, Ralph Kimball warehouse methodology, Cognos 7.0/6.0, SQL*Loader, Toad, Appworx job scheduler, Tidal scheduler, Autosys.

BI: Cognos 7.0/6.0 (Impromptu, Power Play, Transformer), Spotfire

Dimensional Modeling: ERWIN 4.x/3.5.2/3.x, Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Physical and Logical Modeling, Oracle Designer 9i/6i.

Databases: Oracle 11i ERP, Teradata 15.0, Oracle 11g/10g/9i/8i/8.0, PL/SQL, Sybase, SQL Server 11.0, MongoDB, Netezza, MS SQL Server 7.0/2000, MS Access 7.0/97/2000, DB2, Teradata BTEQ scripting.

Servers: Oracle 9iAS, IIS, Apache, Linux, Unix.

Languages: Hadoop, SQL, PL/SQL, Java, C, C++, Perl, XML, COBOL, UNIX Shell Scripting.

OS: Windows NT/XP/2000/98/95, UNIX (Sun Solaris 2.7/2.6, HP-UX 11.x); Microsoft SQL Server Integration Services.

EXPERIENCE:

Confidential, Spring House, PA

Sr. ETL Developer

Responsibilities:

  • Involved in requirement gathering for core tables.
  • Worked with Source System SME’s to gather the required information from Source systems.
  • Developed ETL stage mappings to pull the data from the source system to the staging area.
  • Worked with Data Architect in updating the Erwin data modeler with new table creation with proper indexes.
  • Created Teradata views that read data from the source system into load and lookup tables, using UNION to fetch data from the source system based on given business conditions.
  • Used Informatica Power Center tools to populate data and schedule the jobs.
  • Developed reusable transformations and mapplets which can be used for multiple mappings.
  • Used Teradata and Teradata utilities to load the data into the data marts.
  • Used Teradata parallel transporter, configured sessions to use the TPT.
  • Created Teradata BTEQ scripts to load data into core tables more efficiently and reduce load time.
  • Developed UNIX shell scripts for easy use of files during Informatica process.
  • Developed error tables and an audit table for capturing bad records.
  • Migrated Informatica programs from one environment to another.
  • Provided production support for the project and fixed issues as they arose.
  • Performance-tuned long-running Informatica and SQL jobs.
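
The bad-record handling described above (error tables plus an audit summary) follows a routing pattern that can be sketched as follows; the validation rule and field names here are hypothetical, not the production ones.

```python
def route_records(records, required_fields):
    """Route rows that fail basic validation to an error structure and
    keep a small audit summary (hypothetical rules for illustration)."""
    good, errors = [], []
    for rec in records:
        # a row is rejected if any required field is missing or blank
        missing = [f for f in required_fields if not rec.get(f)]
        if missing:
            errors.append({"record": rec, "reason": "missing: " + ", ".join(missing)})
        else:
            good.append(rec)
    audit = {"read": len(records), "loaded": len(good), "rejected": len(errors)}
    return good, errors, audit
```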

Environment: Informatica Power Center 9.6.1/9.1, Informatica Developer 9.6.1, Informatica Data Transformation, Hadoop HBase, Hive, Oracle 11g, MS SQL Server, Teradata 15.0.5, Spotfire, flat files, SQL*Loader, CSV files, SQL, PL/SQL, FileZilla, CoreFTP, Toad Data Point, Erwin 4.0/3.5.2, Unix Shell Scripting, Tidal scheduler.

Confidential, Mansfield, MA

Sr. ETL Developer

Responsibilities:

  • Involved in Tableau dashboard requirements gathering.
  • Understand the different types of sources involved and relationship between the tables.
  • Developed ETL stage mappings to pull the data from the source system to the staging area.
  • Designed the target tables and the relationships between the tables.
  • Developed Informatica mappings based on the source to target matrix document.
  • Used Informatica Power Center tools to populate data and schedule the jobs.
  • Developed reusable transformations and mapplets which can be used for multiple mappings.
  • Used Teradata and Teradata utilities to load the data into the data marts.
  • Developed UNIX shell scripts for easy use of files during Informatica process.
  • Developed error tables and an audit table for capturing bad records.
  • Migrated Informatica programs from one environment to another.
  • Provided production support for the project and fixed issues as they arose.
  • Performance-tuned long-running Informatica and SQL jobs.

Environment: Informatica Power Center 9.5.1/9.1, Informatica Developer, Oracle 10g, MS SQL Server, Teradata, flat files, SQL*Loader, CSV files, SQL, PL/SQL, DecisionStream, FileZilla, Toad Data Point, Erwin 4.0/3.5.2, Unix Shell Scripting.

Confidential, Danville, PA

Sr. ETL Developer

Responsibilities:

  • Worked closely with the business analyst to gather requirements on how to handle the source data and designed the ETL process accordingly.
  • Worked closely with the data architect on the data model for building the pure wellness model.
  • Developed ETL stage mappings to pull the data from the source system to the staging area.
  • Developed EDW mappings based on the mapping document.
  • Coded reusable transformations and mapplets which can be used for multiple mappings.
  • Wrote complex SQL queries using subqueries to pull data from the database.
  • Worked on Informatica Data Transformation (B2B) for one of the hub projects, converting PDF, EDI X12, and HL7 documents into XML; deployed the services and called them from Power Center using the Unstructured Data transformation to load the data into tables.
  • Worked on Informatica Developer tool to profile the data with custom queries.
  • Worked on Informatica Data Quality (IDQ) to monitor and cleanse data across the enterprise and achieve a standardized master record.
  • Worked on Informatica data explorer (IDE) for profiling the data.
  • Worked on Teradata and Teradata utilities to load the data into the data marts.
  • Created Teradata BTEQ scripts to load the foundation tables.
  • Created UNIX scripts for requirements and automation.
  • Closely worked with the QA team to fix any mapping related issues.
  • Used error handling strategies for trapping errors in a mapping which sent errors to an error table.
  • Migrated Informatica programs from one environment to another.
  • Provided production support for the project and fixed issues as they arose.
  • Wrote UNIX scripts to FTP files from one server to another as required.
  • Actively involved in Informatica administration activities such as creating folders, changing permissions, deleting folders, and removing users.
  • Performance-tuned long-running Informatica and SQL jobs.
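
The profiling done in tools like Informatica Developer and IDE, as used above, boils down to per-column statistics; a toy Python stand-in (not the tool itself) looks like this:

```python
def profile_column(values):
    """Basic column profile: row count, nulls/blanks, distinct values,
    and min/max of the populated values (illustrative only)."""
    non_null = [v for v in values if v not in (None, "")]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }
```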

Environment: Informatica Power Center 9.5.1/9.1, Informatica Developer, Informatica Data Transformation Studio, Oracle 10g, MS SQL Server, Teradata, Teradata BTEQ scripting, Informatica Power Exchange 9.1.1, Informatica Metadata Manager, flat files, SQL*Loader, CSV files, SQL, PL/SQL, FileZilla, Toad 9.1.2, Erwin 4.0/3.5.2, XML sources, Unix Shell Scripting.

Confidential, Nashville, TN

Sr. ETL Developer

Responsibilities:

  • Held direct conversations with the business users, gathered requirements on how to handle the source data, and designed the ETL process accordingly.
  • Developed mappings from the source (cache database) to stage (Oracle) and from stage (Oracle) to the enterprise data warehouse (Oracle).
  • Used normalizer, union, sorter, aggregator, and router transformations for doing the data transformations using informatica power center 8.6.
  • Used Oracle SQL Developer to access data in the source cache database and Oracle.
  • Used the Informatica Debugger to debug data in the transformations used in the ETL process.
  • Performed join operations and wrote subqueries on the ETL stage area where all the source systems were landed.
  • Used Informatica features to implement Type 1 changes in slowly changing dimension tables.
  • Used error handling strategies for trapping errors in a mapping and sending errors to an error table.
  • Maintained Informatica mapping specification documents for the mappings developed and documented them according to business standards.
  • Migrated Informatica programs from one environment to another.
  • Actively involved in Informatica administration activities such as creating folders, changing permissions, deleting folders, and removing users.
  • Used the Deployment Groups option in Informatica Repository Manager to migrate code from one repository to another.
  • Wrote SQL statements and PL/SQL stored procedures to easily retrieve and update data in a given database.
  • Wrote UNIX shell scripts to automate Informatica jobs and workflows on the Informatica server.
  • Created batch processes for the informatica objects like sessions, worklets and workflows.
  • Scheduled Informatica workflows using scheduling tools (Autosys).
  • Actively involved in business analyst meetings to gather the requirements.
  • Debugged and fixed issues that arose during production support.
  • Excellent interpersonal, communication and presentation skills.
  • Strong knowledge of data warehousing concepts and dimensional modeling like star schema and snowflake schema.

Environment: Informatica Power Center 9.1.1/8.6.1, Oracle 10g, Windows XP, Informatica Power Exchange 9.1.1, Informatica Metadata Manager, Vertica, flat files, SAP, SQL*Loader, CSV files, SQL, PL/SQL, FileZilla, Toad 9.1.2, Erwin 4.0/3.5.2, Windows NT, MS Office, XML sources, Unix Shell Scripting.

Confidential, Washington, DC

Sr. ETL Developer

Responsibilities:

  • Extensively used the Informatica client tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Actively interacted with data modelers, data analysts, and business users to gather, verify, and validate various business requirements.
  • Involved in creating mappings using mapping and session parameters in the mappings and workflows to eliminate hard coding wherever possible.
  • Created worklets, and made use of the shared folder concept using shortcuts wherever possible to avoid redundancy.
  • Involved in creating static and dynamic parameter files.
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner, Union, and Normalizer.
  • Used Debugger to monitor data movement and troubleshoot the mappings.
  • Developed test cases for unit testing of the mappings, and also was involved in the integration testing.
  • Effectively documented all development work done.
  • Actively involved in business analyst meetings to gather the requirements.
  • Worked as a test developer; wrote complex SQL queries using Oracle functions to test the data loaded into the target tables, matching source and target counts based on the business requirements.
  • Raised defects when the data mismatched or the data loaded into the target did not conform to the business logic for all fields.
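
The source-to-target count matching used in the testing above can be sketched as a simple reconciliation; the table names below are made up for illustration, and in practice the counts would come from SQL `COUNT(*)` queries on each side.

```python
def reconcile_counts(source_counts, target_counts):
    """Return the tables whose source and target row counts disagree;
    each mismatch would be raised as a defect (illustrative sketch)."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)  # missing target table counts as 0 rows
        if src != tgt:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches
```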

Environment: Informatica Power Center 8.6.1, Oracle 10g, Windows XP, Informatica Power Exchange 8.1.1, flat files, SQL*Loader, SQL, PL/SQL, Reflection, FileZilla, Toad 9.1.2, Erwin 4.0/3.5.2, HP Quality Center, Windows NT, MS Office, XML sources, Unix Shell Scripting.

Confidential, East Lansing, MI

Sr. ETL Developer

Responsibilities:

  • Translated the requirements into functional and technical specifications.
  • Worked on data mart projects from writing technical specifications through developing mappings and workflows, testing them, and migrating them across development and production environments.
  • Extensively used informatica designer to create and manipulate source and target definitions, mappings, mapplets, transformations, and re-usable transformations, etc.
  • Extracted data from Teradata, Oracle, Flat Files and XML and loaded them into Teradata data marts.
  • Worked on Teradata and Teradata utilities to load the data into the data marts.
  • Worked with Teradata MPP (massively parallel processing) and performance-tuned long-running Teradata jobs with an understanding of the Teradata architecture.
  • Involved in the ETL process to migrate the repository from development to testing and production environments.
  • Applied field-level validations such as data cleansing and data merging on various interfaces for transformation into FOCUS values.
  • Developed mappings for data flow from staging to the ERP, and from the ERP back to staging and the DW, following the ERP posting process.
  • Created various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router, and fine-tuned the mappings for optimal performance.
  • Involved in performance tuning of SQL queries, sources, targets, and Informatica sessions for large data loads by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Created and used debugger sessions to debug sessions and created breakpoints for better analysis of mappings.
  • Used UNIX scripts for validation and audits.
  • Used error routines and exceptions handling techniques as per business rules.
  • Used pmcmd, pmrepagent and pmrepserver in non-windows environment.
  • Wrote detailed technical documentation in tune with the ETL standards.
  • Created UNIX shell scripting and automation of ETL processes using Maestro Scheduler.
  • Performed testing on my mappings against ETL standards.
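
The pmcmd-based automation mentioned above typically wraps a `startworkflow` call in a shell script. This sketch only assembles the command line; the flag names follow common pmcmd usage, the service/folder names are hypothetical, and authentication options are deliberately omitted.

```python
def pmcmd_start_workflow(service, domain, user, folder, workflow, wait=True):
    """Assemble a pmcmd startworkflow invocation of the kind called
    from shell wrappers (command construction only, nothing executed)."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,   # Integration Service name
           "-d", domain,     # Informatica domain
           "-u", user,       # user (password option omitted here)
           "-f", folder]     # repository folder
    if wait:
        cmd.append("-wait")  # block until the workflow completes
    cmd.append(workflow)
    return cmd
```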

Environment: Informatica Power Center 7.1.2, Power Mart, Oracle 9i, Teradata v2 R5, DB2, XML, Flat files, Teradata Mload, SQL *Plus, IMS Data, TOAD 7.5, Erwin 4.0, Sun Solaris 5.8, MS SQL Server 2000, Unix Shell Scripting, Windows XP, Maestro, MS-Excel.

Confidential, Irvine, CA

ETL Developer

Responsibilities:

  • Involved in the ETL architectural design and preparation of technical specs based on the business requirements, interacted heavily with the business users, and led the offshore team.
  • Guided the team to follow coding standards and deliver work on time in a systematic manner.
  • Created complex mappings and mapplets using advanced transformations such as Filter, Router, Stored Procedure, Update Strategy, and Normalizer to implement the CDC logic.
  • Used Informatica Power Exchange 8.1.1 extensively for Change Data Capture (CDC) to capture the daily changes in the AS/400 system.
  • Created and modified Oracle stored procedures and functions to be called from Informatica.
  • Configured sessions for pushdown optimization to improve the performance.
  • Ran sessions with error-handling strategies and used Collect Performance Data and verbose data logging to identify errors in the mappings.
  • Performed unit testing and was heavily involved in integration testing.
  • Created deployment groups and migrated code across repositories.
  • Prepared the documentation templates for the team and prepared the documentation for my coding.
  • Using SQL, PL/SQL wrote queries to understand the data and get the required development work done.
  • Used Ralph Kimball warehouse methodology, Star and Snowflake schema.
  • Used Informatica Command, Email, Event Wait, and Event Raise tasks to schedule the workflows and email the support groups on successful or failed completion.
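
The classification at the heart of CDC (decide whether each row is an insert, update, or delete) can be sketched by comparing two keyed snapshots. This toy version is only an analogue: Power Exchange does log-based capture rather than snapshot diffing, and the keys and values here are hypothetical.

```python
def capture_changes(previous, current):
    """Classify rows as inserts, updates, or deletes between two
    keyed snapshots (a toy analogue of change data capture)."""
    inserts = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deletes = sorted(k for k in previous if k not in current)
    return inserts, updates, deletes
```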

Environment: Informatica Power Center 8.1, Power Exchange 8.1.1, Power Mart 7.0, Oracle 10g, DB2, SQL Server 2005, Teradata, flat files, SQL*Loader, Ralph Kimball warehouse methodology, HTML 4.0, CSV files, SQL, PL/SQL, MS SQL Server, Toad 9.1.2, Erwin 4.0/3.5.2, ClearCase, BTEQ, Windows NT, MS Office, XML sources, Unix Shell Scripting.
