
Sr. ETL Developer Resume


Lowell, AR

SUMMARY:

  • 9+ years of IT experience in ETL development, data warehousing, and business intelligence design.
  • Excellent working experience with, and sound knowledge of, the Informatica and Talend ETL tools. Expertise in reusability, parameterization, workflow design, and designing and developing ETL mappings and scripts.
  • Strong proficiency in writing SQL (Oracle, Teradata, SQL Server, Netezza, DB2).
  • 7+ years of experience in SQL optimization and performance tuning.
  • Excellent knowledge of business intelligence tools such as SAP BusinessObjects and MicroStrategy.
  • Solid understanding of both technical and functional data quality concepts.
  • Excellent experience with Oracle, SQL Server, DB2 UDB, Netezza, and Teradata.
  • Experience in developing analytics, dashboards, ad-hoc reports, published reports, and static reports in SAP BO.
  • Able to understand ETL specifications and build ETL applications such as mappings on a daily basis.
  • Expertise in UNIX shell scripting
  • Extensive Knowledge of RDBMS concepts, PL/SQL, Stored Procedure and Normal Forms.
  • Expertise in setting up load strategies and dynamically passing parameters to mappings and workflows in Informatica, and to workflows and data flows in SAP BusinessObjects Data Services.
  • Experience in developing end-to-end business solutions by providing design recommendations on the technologies to be used.
  • Experience in creating reusable transformations and complex mappings
  • Experience in versioning and managing/deployment groups in Informatica environment
  • Well experienced in functional and technical systems analysis and design, systems architecture design, presentation, process interface design, process data flow design, system impact analysis, and design documentation and presentation.
  • Experience in dimensional modeling with both star and snowflake schemas using the Ralph Kimball methodology.
  • Proficiency in SQL in both relational and star-schema environments
  • Proven experience in project and team leading with zero-defect delivery. Equally comfortable working independently and in a team environment.
  • Experience in handling large-scale projects.
  • Expertise in working with Teradata and using TPT.
  • Experience with the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Proficiency in writing Teradata queries.
  • Experience working with native TPT and BTEQ on Unix/Linux.
  • Excellent experience implementing slowly changing dimensions - Types I, II & III.
  • Demonstrated ability to lead projects from planning through completion in fast-paced, time-sensitive environments.
  • Excellent knowledge of planning, estimation, project coordination and leadership in managing large scale projects.
  • Good interpersonal communicator who effectively interacts with clients and customers
  • Decisive, energetic and focused team lead/player who takes ownership and responsibility for requirements and contributes positively to the team.
  • Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, training, implementation and post-implementation review.

TECHNICAL SKILLS:

ETL Tools: Informatica 10.1/9.6.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, Server Manager, PowerConnect), Talend, IDQ, MDM, TOS, TIS.

Databases: Oracle 12c/11g/10g, MS SQL Server 2012/2008/2005, MS Access, DB2, Teradata V2R5/V2R6/V12/V14/V15, Netezza.

Cloud: AWS Redshift, EC2, S3

Teradata Tools & Utilities: SQL Assistant and BTEQ (query facilities); FastLoad, MultiLoad, TPump, TPT, FastExport, and DataMover (load & export).

Languages: SQL, PL/SQL, C, C++, UNIX shell scripting (Korn shell, C shell).

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server 2003/2007.

Web Technologies: HTML, XML, JavaScript

Tools: SQL*Plus, PL/SQL Developer, Toad, SQL*Loader, Active Batch

Operating Systems: Windows 8/7, UNIX, MS-DOS, and Linux

PROFESSIONAL EXPERIENCE:

Confidential - Arlington, VA

Sr. ETL Developer

Responsibilities:

  • Resolved complex technical and functional issues/bugs identified during implementation, testing and post production phases.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Performed extraction, transformation, and loading (ETL) of data using Informatica PowerCenter.
  • Worked on Agile methodology with team of ETL Developers, Database developers and testers.
  • Copied data into AWS Redshift from multiple, evenly sized files using ETL utilities.
  • Assisted with the creation of reusable components as they relate to the ETL framework.
  • Conducted ETL design reviews with the Data Architects and DBAs to ensure that code meets performance requirements.
  • Worked on Informatica Designer Tool's components - Source Analyzer, Transformation Developer, and Mapping Designer.
  • Monitored daily ETL health using diagnostic queries.
  • Assisted team members in functional and Integration testing.
  • Involved in requirements gathering, data modeling and designed Technical, Functional & ETL Design documents.
  • Designed and implemented slowly changing dimension mappings to maintain history.
  • Created PL/SQL stored procedures and various functions using Oracle.
  • Used collections, performance tuned stored procedures and utilized Bulk processing in PL/SQL stored procedures.
  • Implemented Type 2 slowly changing dimensions to maintain dimension history and tuned the mappings for optimum performance.
  • Created and worked with generic stored procedures for various purposes like truncate data from stage tables, insert a record into the control table, generate parameter files etc.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Designed and developed ETL processes that load data for the data warehouse using ETL tools and PL/SQL.
  • Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.
  • Developed standard mappings and reusable Mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup and filter.
  • Wrote and implemented generic UNIX and FTP scripts for various purposes such as running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
  • Maintained existing ETL Oracle PL/SQL procedures, packages, and Unix scripts.
  • Designed data warehouse schema and star schema data models using Kimball methodology.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Designed and developed reporting end user layers (Business Objects Universes) and Business Objects reports.
  • Automated and scheduled Workflows, UNIX scripts and other jobs for the daily, weekly, monthly data loads using Autosys Scheduler.
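The Type 2 slowly changing dimension loads described above follow a common two-step pattern: expire the current row, then insert a new current row. A minimal sketch in Oracle-style SQL, with hypothetical table and column names (dim_customer, stg_customer):

```sql
-- Sketch of a Type 2 SCD load; table, column, and sequence names are illustrative.
-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a new current row for changed and brand-new customers.
INSERT INTO dim_customer
      (customer_key, customer_id, address, segment,
       eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

In Informatica the same logic is typically built as a Lookup plus Update Strategy mapping; the SQL is shown only to make the history-keeping pattern concrete.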

Environment: Informatica PowerCenter 10.1, PowerExchange, Oracle 12c, PL/SQL, Business Objects XI R2, Teradata 15, Erwin 9.7, Autosys, DB2, UNIX.

Confidential - Lowell, AR

Sr. ETL/Informatica Developer

Responsibilities:

  • Worked with various transformations such as Expression, Aggregator, Update Strategy, Look Up, Filter, Router, Joiner and Sequence generator in Informatica for new requirement.
  • Participated in the full SDLC (requirements, analysis, design, testing, and deployment) of Informatica PowerCenter solutions.
  • Created an Amazon Redshift queue dedicated to ETL processes and configured it with a small number of slots (5 or fewer), since Redshift is designed for analytics queries rather than transaction processing; the cost of COMMIT is relatively high, and excessive use of COMMIT can result in queries waiting for access to the commit queue.
  • Because ETL is a commit-intensive process, running it in a separate queue with a small number of slots mitigated this issue.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Resolved incidents using the ALM system and provided production support, handling production failures and fixing them within SLA.
  • Created/modified Informatica workflows and mappings (PowerCenter and Cloud); also involved in unit testing, internal quality analysis procedures, and reviews.
  • Validated and fine-tuned the ETL logic coded into existing Power Center Mappings, leading to improved performance.
  • Loaded data in bulk ETL using AWS Redshift.
  • Part of Informatica cloud integration with Amazon Redshift
  • Managed Informatica Cloud-based tools, including AddressDoctor.
  • Integrated data from various source systems such as SAP (ABAP), SQL Server (DTS), Ariba, and Oracle (ODBC).
  • Wrote basic UNIX shell scripts and PL/SQL packages and procedures.
  • Involved in performance tuning of the mappings, session and SQL queries.
  • Creating/modifying Informatica Workflows and Mappings.
  • Used different control flow elements such as the For Each Loop container, Sequence container, Execute SQL task, and Send Mail task.
  • Used UNLOAD to extract large result sets.
  • Used event handling to send e-mail on error events during transformation.
  • Used the logging feature for analysis purposes.
  • Database and Log Backup, Restoration, Backup Strategies, Scheduling Backups.
  • Improved the performance of the SQL server queries using query plan, covering index, indexed views and by rebuilding and reorganizing the indexes.
  • Performed tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
  • Used Amazon Redshift Spectrum for ad hoc ETL processing.
  • Troubleshooting performance issues and fine-tuning queries and stored procedures.
  • Defined Indexes, Views, Constraints and Triggers to implement business rules.
  • Involved in Writing Complex T-SQL Queries.
  • Backing up master & system databases and restoring them.
  • Developed Stored Procedures and Functions to implement necessary business logic for interface and reports.
  • Involved in testing and debugging stored procedures.
  • Wrote DAX statements for the cube.
  • Designed Visualizations in Tableau.
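The Redshift bullets above (bulk loads, expensive COMMITs, UNLOAD for large extracts) correspond to a standard pattern; a hedged sketch in Redshift SQL, with hypothetical bucket, role, and table names:

```sql
-- Bulk load: COPY reads files in parallel, so splitting the input into
-- multiple evenly sized files keeps all slices busy. The bucket, manifest,
-- IAM role, and table names below are illustrative.
COPY stg_sales
FROM 's3://etl-bucket/sales/manifest.json'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
MANIFEST
GZIP
DELIMITER '|';

-- COMMIT is relatively expensive in Redshift, so wrap multi-step
-- transforms in a single transaction: one commit per batch, not per statement.
BEGIN;
DELETE FROM sales USING stg_sales WHERE sales.sale_id = stg_sales.sale_id;
INSERT INTO sales SELECT * FROM stg_sales;
COMMIT;

-- Extract large result sets with UNLOAD rather than a plain SELECT,
-- so the export is written to S3 in parallel.
UNLOAD ('SELECT * FROM sales WHERE sale_date >= ''2020-01-01''')
TO 's3://etl-bucket/export/sales_'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
PARALLEL ON
GZIP;
```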

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g/10g, PL/SQL, UNIX shell scripting, SQL Server, Visual Studio 2008, SSIS, SSRS.

Confidential - Bellevue, WA

ETL/ Informatica Developer

Responsibilities:

  • Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.
  • Used Informatica Power Center for Extraction, Transformation and Loading data from heterogeneous source systems into the target data base.
  • Worked with business users and architects to understand requirements and keep the process on pace to meet milestones.
  • Performed impact analysis of the existing RRP-variant process and designed the solution for automating input and output processes to/from RRP.
  • Designed the above projects and frameworks and led the development teams to meet project timelines within budget.
  • Designed and developed the Informatica processes to send data to retail web services (RWS) and capture the responses.
  • Scheduled various daily and monthly ETL loads using Control-M.
  • Worked with the global support technology relationship team to set up VDIs with the required software packages for offshore ETL development, and onboarded the offshore team with training.
  • Identified and eliminated duplicates in datasets through the IDQ components Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables a single view of customers and helps control mailing-list costs by preventing duplicate mailings.
  • Effectively communicates with other technology and product team members.
  • Worked on performance-improvement areas: debugged issues and delivered solutions that reduced process times.
  • Used Informatica Data Quality (IDQ) for data quality measurement.
  • Created mappings to cleanse the data and populate staging tables, moved data from staging to archive and then to the enterprise data warehouse while transforming it to business needs, and populated the data mart with only the required information.
  • Extensively used Informatica PowerCenter tools and transformations such as Lookup, Aggregator, Joiner, Rank, Update Strategy, and Mapplets, along with connected and unconnected stored procedures/functions, SQL overrides in Lookups, and source filters in Source Qualifiers.
  • Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.
  • Responsible for Performance tuning at various levels during the development.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extracted/loaded data from/into diverse source/target systems like SQL server, XML and Flat Files.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets for better performance.
  • Managed post production issues and delivered all assignments/projects within specified time lines.
  • Extensive use of Persistent cache to reduce session processing time.
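The drop-and-recreate-indexes technique mentioned above is usually wrapped in small stored procedures called from a session's pre- and post-SQL. A minimal PL/SQL sketch, with hypothetical index and table names:

```sql
-- Hypothetical pre-/post-session procedures: dropping indexes before a
-- bulk load avoids per-row index maintenance; rebuilding them afterwards
-- restores query performance. All object names are illustrative.
CREATE OR REPLACE PROCEDURE drop_fact_indexes AS
BEGIN
  EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_date';
  EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_cust';
END;
/

CREATE OR REPLACE PROCEDURE create_fact_indexes AS
BEGIN
  EXECUTE IMMEDIATE 'CREATE INDEX idx_fact_sales_date ON fact_sales (sale_date)';
  EXECUTE IMMEDIATE 'CREATE INDEX idx_fact_sales_cust ON fact_sales (customer_key)';
END;
/
```

In practice these are invoked from the session's Pre SQL/Post SQL properties or through a Stored Procedure transformation.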

Environment: Informatica PowerCenter 9.x/8.x, Oracle, SQL Server 2008, LINUX, IDQ, LSF (job scheduling), PL/SQL, UNIX shell scripting

Confidential - Chicago, IL

ETL/Informatica Developer

Responsibilities:

  • Involved in Analysis, Design and Development, test and implementation of Informatica transformations and workflows for extracting the data from the multiple systems.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica.
  • Worked on Dimension as well as Fact tables, developed mappings and loaded data on to the relational database.
  • Developed mappings in multi-schema databases to load the incremental data into dimensions.
  • Experience in Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files & XML Files.
  • Worked extensively on transformations like Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator and Joiner transformations.
  • Created, launched & scheduled sessions and batches using Power Center Server Manager.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance, as part of pre- and post-session management.
  • Responsible for migrating the workflows from development to production environment.
  • Used parameter files, mapping variables, and mapping parameters for incremental loading.
  • Used Connected and Unconnected lookups in various mappings.
  • Scheduled jobs using tools such as Tivoli Workload Scheduler (TWS).
  • Reviewed and documented existing SQL*Loader ETL scripts.
  • Analyzed the session and error logs for troubleshooting mappings and sessions.
  • Worked on SQL tools like TOAD to run SQL Queries to validate the data.
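Incremental loading with mapping variables, as described above, typically hinges on a Source Qualifier SQL override that filters on the last successful extract time. A sketch with hypothetical table and variable names ($$LAST_EXTRACT_DATE is assumed to be supplied via a parameter file and advanced with SETMAXVARIABLE after each successful run):

```sql
-- Hypothetical Source Qualifier SQL override for incremental loading.
-- $$LAST_EXTRACT_DATE is a mapping variable; only rows changed since the
-- previous run are pulled, so each load processes just the delta.
SELECT o.order_id,
       o.customer_id,
       o.order_amt,
       o.updated_at
  FROM orders o
 WHERE o.updated_at > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')
```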

Environment: Informatica PowerCenter 9.1, PL/SQL, Oracle 10g, Teradata, SQL Server 2008/2012, Windows 2000, UNIX, Shell Scripting, Oracle PL/SQL, TOAD 11.0.

Confidential

Informatica Developer

Responsibilities:

  • Actively participated in understanding business requirements, analysis and designing ETL process.
  • Effectively applied all business requirements and transformed the business rules into mappings.
  • Developed Mappings between source systems and Warehouse components.
  • Used Informatica designer to create complex mappings using different transformations to move data to a Data Warehouse.
  • Developed extract logic mappings and configured sessions.
  • Extensively used Filter and Expression transformations on source data to filter out invalid data.
  • Extensively used ETL to load data from flat files (both fixed-width and delimited) as well as from the relational database, which was Oracle.
  • Worked on Debugging, Troubleshooting and documentation of the Data Warehouse.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Handled the performance tuning of Informatica mappings.
  • Developed Shell Scripts as per requirement.
  • Prepared PL/SQL scripts for data loading into Warehouse and Mart.
  • Fixed SQL errors within the deadline.
  • Made appropriate changes to schedules when jobs were delayed.
  • Self-Review of Unit test cases, Integration test cases of all the assigned modules.

Environment: Informatica Power Center 8.6, Windows XP, Oracle 10g, UNIX/LINUX, SQL Server.
