Sr. ETL Developer Resume
Chandler, AZ
SUMMARY
- Senior Informatica Developer with 13 years of IT experience and a strong background in ETL data warehousing using Informatica PowerCenter 10.x/9.x/8.x; experienced in planning, designing, developing, and implementing data warehouses and data marts, with both relational and multidimensional database design.
- Experience using Informatica PowerCenter 10.2/10.1/9.6.1/8.x/7.x: Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Admin Console.
- Extensive experience developing complex mappings using a variety of transformations, including Source Qualifier, Connected and Unconnected Lookup, Router, Filter, Sorter, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure, and Sequence Generator.
- Expertise in the design and implementation of slowly changing dimensions (SCD Types 1, 2, and 3) and change data capture (CDC); a minimal SCD Type 2 sketch appears after this summary.
- Experienced in loading data, troubleshooting, debugging mappings, and performance tuning of Informatica sources, targets, mappings, and sessions, including fine-tuning transformations for better session performance.
- Experience implementing complex business rules by creating reusable transformations and developing complex mapplets and mappings.
- Instrumental in establishing ETL naming standards and best practices throughout the ETL process (transformations, sessions, workflow names, log files, and input/variable/output ports).
- Database experience using Teradata 16.10/15.10/14, PostgreSQL, H2 Database, Oracle Exadata/12c/11g/10g/9i, MS SQL Server 2014/2008, AS400, DB2, and MS Access. Also used SQL editors such as Teradata SQL Assistant, H2 Console, TOAD, SQL*Plus, and SQL Analyzer.
- Experience using Teradata Parallel Transporter (TPT) load operators such as LOAD, UPDATE, STREAM, and EXPORT.
- Experience in using Teradata Macros, Query banding and BTEQ Scripts on Linux/Unix.
- Strong experience using T-SQL, PL/SQL Procedures/Functions, Triggers and Packages.
- Experience in Unix/Linux Operating System and Shell scripting.
- Experience in integration of various data sources like Teradata, Oracle, PostgreSQL, S3, HDFS, MS SQL Server, Flat Files, and XML Definitions.
- Good understanding of Views, Synonyms, Indexes, Joins, and Sub-Queries.
- Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (star schema, snowflake schema, fact and dimension tables), and OLAP.
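A minimal sketch of the SCD Type 2 pattern referenced above, written as a BTEQ script driven from a Unix shell wrapper. The logon string, databases, tables, and columns are hypothetical placeholders, not details from any actual engagement.

```sh
#!/bin/ksh
# Hypothetical SCD Type 2 load: expire the current version of changed rows,
# then insert fresh current versions for new and changed keys.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd

/* Step 1: close out the current row when a tracked attribute changed */
UPDATE dim
FROM edw.dim_customer AS dim, stg.customer AS src
SET eff_end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.cust_id = src.cust_id
  AND dim.current_flag = 'Y'
  AND dim.cust_addr <> src.cust_addr;

/* Step 2: any staged key with no current row is new or was just expired */
INSERT INTO edw.dim_customer
  (cust_id, cust_name, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT src.cust_id, src.cust_name, src.cust_addr,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer AS src
LEFT JOIN edw.dim_customer AS dim
  ON dim.cust_id = src.cust_id
 AND dim.current_flag = 'Y'
WHERE dim.cust_id IS NULL;

.LOGOFF
.QUIT
EOF
```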
TECHNICAL SKILLS
- Informatica PowerCenter 10.2/10.1/9.6.1/9.5/9.1/8.x
- Admin Console
- Informatica Developer
- OWB 10g
- Apache Spark
- Azkaban
- GitHub
- PostgreSQL
- Amazon S3
- HDFS
- Teradata 13.10/14/14.10/15.10/16.10
- Oracle Exadata/12c/11g/10g/9i/8i
- SQL Server 2014/2008/2005
- MS Access
- DB2
- AS400
- H2 Database
- SQL, PL/SQL
- C, C++, Data Structures
- Unix Shell Script
- Teradata SQL Assistant
- H2 Console
- SQL*Plus
- PL/SQL Developer
- Toad
- SQL*Loader
- Sqoop
- Windows Server 2003/2008
- UNIX/MS-DOS/Linux
PROFESSIONAL EXPERIENCE
Sr. ETL Developer
Confidential, Chandler, AZ
Responsibilities:
- Used a metadata-driven ETL framework to generate XMLs and control batch execution.
- Designed, developed, and implemented ETL processes to extract, transform, and load data from inbound flat files and various source systems into the data mart using Informatica PowerCenter.
- Developed mappings to extract data from SQL Server, Oracle, Teradata, SFDC, Siebel, and flat files and load it into Teradata using Informatica PowerCenter.
- Worked extensively with transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Connected and Unconnected Lookup, Sorter, Normalizer, SQL, and Sequence Generator.
- Worked on Infrastructure setup for building new applications.
- Designed and developed processes to handle very high data volumes using Teradata Parallel Transporter (TPT) load operators such as LOAD and UPDATE.
- Wrote BTEQ scripts on Linux, used in pre/post-load steps to insert into and update the process activity tables; also used Teradata Macros and query banding (a sketch of this wrapper pattern follows this list).
- Used Informatica Pushdown Optimization (PDO) to push transformation processing from the PowerCenter engine into the relational database to improve performance.
- Wrote shell and Perl scripts to execute workflows, clean up files, run BTEQ scripts, and handle other batch processing.
- Used Teradata Viewpoint to monitor system usage, query performance, and skew/spool status.
- Used the Autosys scheduler to create, schedule, and control batch jobs.
- Used Network Data Mover (NDM) to receive and transfer files across multiple source systems.
- Assisted and worked on performance testing, data quality assessment & production deployments.
- Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
- Used an Application Lifecycle Management (ALM) tool to manage and track the application lifecycle and defects.
- Used JIRA for issue tracking and UrbanCode Deploy as part of the automation process to deploy Unix-related components.
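A minimal sketch of the wrapper pattern referenced above: a BTEQ pre-load insert into a process-activity table, the PowerCenter workflow run via pmcmd, then a BTEQ post-load update. The service, domain, folder, workflow, table, and credential names are hypothetical placeholders.

```sh
#!/bin/ksh
# Hypothetical batch wrapper. TD_PWD, PC_USER, and PC_PWD are assumed to be
# exported by the scheduler environment; exit on the first failure.
set -e

# Pre-load: record the batch start in the process-activity table
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}
INSERT INTO etl_ctl.process_activity (proc_name, status, start_ts)
VALUES ('wf_sales_load', 'STARTED', CURRENT_TIMESTAMP(0));
.LOGOFF
.QUIT
EOF

# Run the PowerCenter workflow and block until it finishes
pmcmd startworkflow -sv INT_SVC -d DOM_PROD \
      -u "$PC_USER" -p "$PC_PWD" -f SALES -wait wf_sales_load

# Post-load: mark the batch complete
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}
UPDATE etl_ctl.process_activity
SET status = 'COMPLETED', end_ts = CURRENT_TIMESTAMP(0)
WHERE proc_name = 'wf_sales_load' AND status = 'STARTED';
.LOGOFF
.QUIT
EOF
```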
Environment: Informatica PowerCenter 10.2/10.1/9.6.1, Teradata 16.10/15.10/14.10, Oracle Exadata/12c/11g, Teradata SQL Assistant, Teradata Viewpoint, Autosys, NDM, DTS, SQL, Perl, UNIX Shell Scripts, Linux, TortoiseSVN, GitHub, ALM, JIRA, UrbanCode Deploy, Jenkins
Sr. ETL Developer
Confidential, Norfolk, VA
Responsibilities:
- Worked with Business Analysts and Business Owners in gathering requirements and designing Logical/Physical Data Models.
- Developed PowerCenter mappings to extract data from various databases and generate flat files using Informatica 9.6.1.
- Used the Tidal Enterprise Scheduler to create, schedule, and control batch jobs and Informatica workflows.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy and Sequence generator, Stored Procedure.
- Used an automated testing utility to migrate Informatica XML and SQL code to the QA environment.
- Created documents to describe program development, logic, coding, testing, changes and corrections.
- Created Technical Design Document and worked with operations team to resolve production issues.
- Wrote batch scripts to move target data files to outbound and archive locations and to SFTP the files to vendors (a shell sketch of this flow follows this list).
- Assisted and worked on performance testing, data quality assessment & production deployments.
- Involved in jobs monitoring and production support in a 24/7 environment.
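The file-movement bullet above was implemented with Windows batch scripts; the sketch below shows the same outbound/SFTP/archive flow as a Unix shell script, for consistency with the other sketches in this resume. The paths, vendor host, and file pattern are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical outbound flow: SFTP each target data file to the vendor,
# then move it to the archive with a timestamp so reruns never collide.
OUTBOUND=/data/outbound
ARCHIVE=/data/archive
VENDOR=vendor_user@sftp.vendor.example    # placeholder host; key-based auth assumed

for f in "$OUTBOUND"/TGT_*.dat; do
    [ -e "$f" ] || continue               # nothing staged; skip
    # -b - reads the sftp batch commands from stdin
    sftp -b - "$VENDOR" <<EOF
put $f /incoming/
EOF
    mv "$f" "$ARCHIVE/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done
```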
Environment: Informatica PowerCenter 9.5.1/9.6.1, SQL Server 2014, Teradata SQL Assistant, Oracle 11g, Toad, Tidal, SFTP, SQL, Batch Scripts, TortoiseSVN, Enterprise Jira, Confluence, Agile
Sr. Informatica Developer
Confidential, Glendale, CA
Responsibilities:
- Created functional requirements documents based on business requirements.
- Created internal Wiki documents describing the process design and production support steps.
- Designed, developed, and modified ETL mappings based on technical specifications.
- Performed unit and integration testing.
- Designed and developed the ETL load audit control architecture and processes for reporting and monitoring.
- Designed and developed a framework to enable the reusability of tables and sessions across the project.
- Used Confluence as the organizational Wiki to discuss work, upload process documents, etc.
- Created Sqoop scripts to ingest data from HDFS to Teradata, from SQL Server to HDFS, and into PostgreSQL (see the Sqoop sketch after this list).
- Used the Azkaban scheduler to run and monitor Hadoop jobs.
- Used H2 Console to access H2 database and other databases using JDBC drivers.
- Used Amazon S3 as storage for all the data.
- Used Apache Spark to connect to various APIs and ingest data.
- Upgraded Informatica from 9.5.1 to 9.6.1 on Windows server and client.
- Created batch scripts to execute workflows, schedule and unschedule jobs, import and export users via the Admin Console, and run FTP, BTEQ, and TPT scripts.
- Involved in Informatica server configuration, PowerCenter server/client and PowerExchange installations, Admin Console setup, repository migrations, and database backups/restores.
- Assisted with day-to-day operational tasks related to team responsibilities and provided both ongoing and off-hour support and maintenance on ETL/DW solutions.
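A minimal sketch of the Sqoop movements described above. The JDBC URLs, schemas, tables, HDFS paths, and credential file are placeholders, and the Teradata connector (driver jars and connection manager) is assumed to be installed and configured separately.

```sh
#!/bin/sh
# Hypothetical Sqoop jobs for the three data movements.

# 1) SQL Server -> HDFS
sqoop import \
  --connect 'jdbc:sqlserver://sqlhost:1433;databaseName=srcdb' \
  --username etl_user --password-file /user/etl_user/.sqoop.pwd \
  --table Orders \
  --target-dir /raw/orders \
  --num-mappers 4

# 2) HDFS -> Teradata
sqoop export \
  --connect jdbc:teradata://tdhost/DATABASE=edw \
  --username etl_user --password-file /user/etl_user/.sqoop.pwd \
  --table ORDERS_FCT \
  --export-dir /curated/orders

# 3) HDFS -> PostgreSQL
sqoop export \
  --connect jdbc:postgresql://pghost:5432/analytics \
  --username etl_user --password-file /user/etl_user/.sqoop.pwd \
  --table orders_rpt \
  --export-dir /curated/orders
```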
Environment: Informatica PowerCenter 9.5.1/9.6.1, Teradata 14.10, SAP, SQL Server 2008, Teradata SQL Assistant, Rally Agile tool, Informatica Scheduler, Globalscape, SQL, Batch Scripts, Windows Server 2008/2003, GitHub, Amazon S3, Sqoop, Azkaban Scheduler, H2 Console, Spark, HDFS, Shell Scripts, Confluence.
Sr. ETL Developer
Confidential, Tempe, AZ
Responsibilities:
- Created functional design documents and technical design specification documents for the ETL process based on the business requirements.
- Designed, developed, and implemented ETL processes to extract, transform, and load data from inbound flat files and various source systems into the data mart using Informatica PowerCenter.
- Developed mappings to extract data from SQL Server, Oracle, and flat files and load it into Oracle using PowerCenter.
- Used SQL*Loader to load data from flat files into Oracle database tables (a sketch follows this list).
- Created database Tables, Indexes, Views, Materialized Views, Stored Procedures and Packages to support backend development.
- Created complex mappings and reusable transformations, making use of mapping variables and mapping parameters.
- Worked extensively with transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Connected and Unconnected Lookup, Sorter, Normalizer, SQL, and Sequence Generator.
- Created workflows and worklets and used tasks such as Email, Command, Decision, Event Wait, Event Raise, and Assignment in the Workflow Manager.
- Tuned performance of Informatica sessions for large data set by increasing block size, data cache size, sequence buffer length, target based commit interval and session partitioning.
- Used Autosys to create, schedule, and control batch jobs.
- Ensured standards and best practices used in the code development are followed, documented and maintained.
- Assisted and worked on performance testing, data quality assessment & production deployments.
- Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
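A minimal SQL*Loader sketch for the flat-file loads above. The control file, table, columns, and connect string are hypothetical placeholders; the password is prompted at run time.

```sh
#!/bin/sh
# Hypothetical SQL*Loader run: write a control file, then load the flat file.

cat > stg_customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cust_id,
  cust_name,
  created_dt DATE 'YYYY-MM-DD'
)
EOF

# Bad and log files make rejected rows easy to triage after the run
sqlldr userid=etl_user@ORCL \
       control=stg_customers.ctl \
       log=stg_customers.log \
       bad=stg_customers.bad \
       errors=100 direct=false
```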
Environment: Informatica PowerCenter 9.6.1/9.1, Oracle 11g, TOAD, SQL*Loader, Autosys, NDM, SQL, PL/SQL, UNIX Shell Scripts, Linux/Solaris, TortoiseSVN, ALM.