
Senior ETL Developer Resume


Arlington, VA

SUMMARY

  • Over 8 years of IT experience in ETL development, data warehousing, and business intelligence design and technologies.
  • Excellent working experience and sound knowledge of the Informatica and Talend ETL tools. Expertise in reusability, parameterization, workflow design, and the design and development of ETL mappings and scripts.
  • Strong proficiency in writing SQL (including Oracle, Teradata, SQL Server, Netezza, DB2), with good exposure to SQL optimization and performance tuning.
  • Excellent knowledge of Business Intelligence tools such as SAP BusinessObjects and MicroStrategy.
  • Solid understanding of both technical and functional data quality concepts.
  • Excellent experience with Oracle, SQL Server, DB2 UDB, Netezza, and Teradata.
  • Experience in developing analytics, dashboards, ad-hoc reports, published reports, and static reports in SAP BO.
  • Good at understanding ETL specifications and building ETL applications such as mappings on a daily basis.
  • Expertise in UNIX shell scripting
  • Extensive Knowledge of RDBMS concepts, PL/SQL, Stored Procedure and Normal Forms.
  • Expertise in setting up load strategies and dynamically passing parameters to mappings and workflows in Informatica, and to workflows and data flows in SAP BusinessObjects Data Services.
  • Experience in developing end-to-end business solutions by providing design recommendations on the technologies to be used.
  • Experience in creating reusable transformations and complex mappings
  • Experience in versioning and managing/deployment groups in Informatica environment
  • Well experienced in functional and technical systems analysis and design, systems architectural design, presentation, process interface design, process data flow design, system impact analysis, and design documentation and presentation.
  • Experience with dimensional modeling using both star and snowflake schemas, following the Ralph Kimball methodology.
  • Proficiency in SQL in both relational and star-schema environments
  • Proven experience in project and team leadership with zero-defect delivery; equally comfortable working independently and in a team environment.
  • Experience handling large-scale projects.
  • Expertise in working with Teradata and using TPT
  • Experience with Teradata utilities (FastLoad, MultiLoad, TPump) to load data
  • Proficiency in writing Teradata queries
  • Experience working with native TPT and BTEQ on UNIX/Linux
  • Excellent experience implementing slowly changing dimensions (Types I, II, and III)
  • Demonstrated ability to lead projects from planning through completion under fast paced and time sensitive environments
  • Excellent knowledge of planning, estimation, project coordination and leadership in managing large scale projects.
  • Good interpersonal communicator who effectively interacts with clients and customers
  • Decisive, energetic and focused team lead/player who takes ownership and responsibility for requirements and contributes positively to the team.
  • Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, training, implementation and post-implementation review.

TECHNICAL SKILLS

ETL Tools: Informatica 10.1/9.6.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, Server Manager, PowerConnect), Talend, IDQ, MDM, TOS, TIS.

Databases: Oracle 12c/11g/10g, MS SQL Server 2012/2008/2005, MS Access, DB2, Teradata V2R5/V2R6/V12/V14/V15, Netezza.

Cloud: AWS Redshift, EC2, S3

Teradata Tools & Utilities: Query Facilities: SQL Assistant, BTEQ; Load & Export: FastLoad, MultiLoad, TPump, TPT, FastExport, Data Mover.

Languages: SQL, PL/SQL, C, C++, UNIX shell scripting (Korn shell, C shell).

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server 2003/2007.

Web Technologies: HTML, XML, JavaScript

Tools: SQL*Plus, PL/SQL Developer, Toad, SQL*Loader, ActiveBatch

Operating Systems: Windows 8/7, UNIX, MS-DOS, and Linux

PROFESSIONAL EXPERIENCE

Confidential, Arlington, VA

Senior ETL Developer

Responsibilities:

  • Resolved complex technical and functional issues/bugs identified during implementation, testing and post production phases.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Extraction, Transformation and Loading (ETL) of data by using Informatica Power Center.
  • Worked on Agile methodology with team of ETL Developers, Database developers and testers.
  • Copied data from multiple, evenly sized files into AWS Redshift using ETL utilities.
  • Assisted with the creation of reusable components as they relate to the ETL framework.
  • Conducted ETL design reviews with the Data Architects and DBAs to ensure that code meets performance requirements.
  • Worked on Informatica Designer Tool's components - Source Analyzer, Transformation Developer, and Mapping Designer.
  • Monitored daily ETL health using diagnostic queries.
  • Assisted team members in functional and Integration testing.
  • Involved in requirements gathering, data modeling and designed Technical, Functional & ETL Design documents.
  • Designed and implemented slowly changing dimension mappings to maintain history.
  • Created PL/SQL stored procedures and various functions using Oracle.
  • Used collections, performance tuned stored procedures and utilized Bulk processing in PL/SQL stored procedures.
  • Implemented Type 2 slowly changing dimensions to maintain dimension history and tuned the mappings for optimum performance (a SQL sketch of this pattern follows this list).
  • Created and worked with generic stored procedures for various purposes such as truncating data from stage tables, inserting a record into the control table, generating parameter files, etc.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Designed and developed ETL processes that load data into the data warehouse using ETL tools and PL/SQL.
  • Simplified the development and maintenance of ETL by creating Mapplets and reusable transformations to prevent redundancy.
  • Developed standard mappings and reusable Mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup and filter.
  • Wrote and implemented generic UNIX and FTP scripts for various purposes such as running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
  • Maintained existing ETL Oracle PL/SQL procedures, packages, and Unix scripts.
  • Designed data warehouse schemas and star-schema data models using the Kimball methodology.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Designed and developed reporting end user layers (Business Objects Universes) and Business Objects reports.
  • Automated and scheduled workflows, UNIX scripts, and other jobs for the daily, weekly, and monthly data loads using the Autosys scheduler.
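
A minimal SQL sketch of the Type 2 slowly changing dimension pattern noted above, assuming a hypothetical customer_dim target with eff_start_dt/eff_end_dt/current_flag history columns, a customer_stg staging table, and a customer_dim_seq surrogate-key sequence (all names are illustrative; in the mappings the same logic was built with Lookup and Update Strategy transformations):

    -- Expire the current dimension row when a tracked attribute changed in staging
    UPDATE customer_dim d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Insert a new current row for brand-new customers and for the customers expired above
    INSERT INTO customer_dim
           (customer_key, customer_id, address, status, eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

    COMMIT;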

Environment: Informatica PowerCenter 10.1, PowerExchange, Oracle 12c, PL/SQL, Business Objects XI R2, Teradata 15, Erwin 9.7, Autosys, DB2, UNIX.

Confidential, Lowell, AR

Sr. ETL/Informatica Developer

Responsibilities:

  • Worked with various transformations such as Expression, Aggregator, Update Strategy, Look Up, Filter, Router, Joiner and Sequence generator in Informatica for new requirement.
  • Participated in the full SDLC (Software Development Life Cycle): requirements, analysis, design, testing, and deployment of Informatica PowerCenter solutions.
  • Created a workload management queue dedicated to ETL processes and configured it with a small number of slots (5 or fewer), since Amazon Redshift is designed for analytics queries rather than transaction processing; the cost of COMMIT is relatively high, and excessive use of COMMIT can result in queries waiting for access to the commit queue (a sketch follows this list).
  • Because ETL is a commit-intensive process, ran it in this separate queue with a small number of slots to mitigate the issue.
  • Involved in developing Informatica mappings and tuning them for better performance.
  • Resolved incidents using the ALM system and provided production support, handling production failures and fixing them within SLA.
  • Created and modified Informatica workflows and mappings (PowerCenter and Cloud); also involved in unit testing, internal quality analysis procedures, and reviews.
  • Validated and fine-tuned the ETL logic coded into existing PowerCenter mappings, leading to improved performance.
  • Performed bulk ETL data loads into AWS Redshift.
  • Part of Informatica cloud integration with Amazon Redshift
  • Managed Informatica Cloud-based tools including Address Dr.
  • Integrated data from various source systems such as SAP (ABAP), SQL Server (DTS), Ariba, Oracle (ODBC), and so on.
  • Wrote basic UNIX shell scripts and PL/SQL packages and procedures.
  • Involved in performance tuning of the mappings, sessions, and SQL queries.
  • Creating/modifying Informatica Workflows and Mappings.
  • Used various control flow components such as the For Each Loop container, Sequence container, Execute SQL task, and Send Mail task.
  • Used the Redshift UNLOAD command to extract large result sets.
  • Used event handling to send e-mail on error events at the time of transformation
  • Used the logging feature for analysis purposes
  • Database and Log Backup, Restoration, Backup Strategies, Scheduling Backups.
  • Improved the performance of SQL Server queries using query plans, covering indexes, indexed views, and by rebuilding and reorganizing indexes.
  • Performed tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
  • Used Amazon Redshift Spectrum for ad hoc ETL processing.
  • Troubleshooting performance issues and fine-tuning queries and stored procedures.
  • Defined Indexes, Views, Constraints and Triggers to implement business rules.
  • Involved in writing complex T-SQL queries.
  • Backing up master & system databases and restoring them.
  • Developed Stored Procedures and Functions to implement necessary business logic for interface and reports.
  • Involved in testing and debugging stored procedures.
  • Wrote the DAX statements for the cube
  • Designed Visualizations in Tableau.
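
A minimal sketch of the Redshift usage described above, assuming a hypothetical etl_queue WLM query group, an orders_stage staging table, an example S3 bucket/prefix, and a placeholder IAM role ARN (all names are illustrative):

    -- Route this session to the ETL-dedicated WLM queue and claim extra slots for the load
    SET query_group TO 'etl_queue';
    SET wlm_query_slot_count TO 3;

    -- Bulk load from multiple, evenly sized, compressed files sharing one S3 key prefix
    COPY orders_stage
    FROM 's3://example-bucket/staging/orders/part_'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
    DELIMITER '|' GZIP;

    -- A single COMMIT per batch keeps pressure off the commit queue
    COMMIT;

    -- Extract a large result set straight to S3 instead of pulling it through the leader node
    UNLOAD ('SELECT order_id, order_dt, amount FROM orders_stage')
    TO 's3://example-bucket/extracts/orders_'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
    DELIMITER '|' PARALLEL ON;

    RESET query_group;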

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g/10g, PL/SQL, UNIX shell scripting, SQL Server, Visual Studio 2008, SSIS, SSRS.

Confidential, Bellevue, WA

ETL/ Informatica Developer

Responsibilities:

  • Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.
  • Used Informatica PowerCenter for extraction, transformation, and loading of data from heterogeneous source systems into the target database.
  • Worked with business users and architects to understand the requirements and to speed up the process of meeting milestones.
  • Performed impact analysis of the existing RRP-variant process and designed the solution for automating input and output processes to/from RRP.
  • Designed the above projects and frameworks and led the development teams to meet project timelines within budget.
  • Designed and developed the Informatica processes to send data to retail web services (RWS) and capture the response.
  • Scheduled various daily and monthly ETL loads using Control-M.
  • Worked with the global support technology relationship team to set up VDIs with the required software packages for offshore ETL development, and onboarded the offshore team with training.
  • Identified and eliminated duplicates in datasets through the IDQ Edit Distance, Jaro Distance, and Mixed Field matcher components; this enables the creation of a single view of customers and helps control mailing-list costs by preventing multiple pieces of mail.
  • Effectively communicates with other technology and product team members.
  • Worked on performance improvement areas, debugging issues and coming up with solutions that reduce process times.
  • Informatica Data Quality (IDQ) is the tool used here for data quality measurement.
  • Created mappings that cleanse the data and populate it into staging tables, move it from staging to archive and then to the enterprise data warehouse by transforming the data to business needs, and populate the data mart with only the required information.
  • Extensively used Informatica PowerCenter tools and transformations such as Lookup, Aggregator, Joiner, Rank, Update Strategy, and Mapplets, plus connected and unconnected stored procedures/functions, SQL overrides in Lookups, and source filters in Source Qualifiers.
  • Created pre/post-session and SQL commands in sessions and mappings on the target instance.
  • Responsible for performance tuning at various levels during development.
  • Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned accordingly for better performance.
  • Extracted/loaded data from/into diverse source/target systems like SQL server, XML and Flat Files.
  • Used stored procedures to create a standard Time dimension, and to drop and create indexes before and after loading data into the targets for better performance (see the sketch following this list).
  • Managed post-production issues and delivered all assignments/projects within specified timelines.
  • Extensive use of Persistent cache to reduce session processing time.
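
A minimal Oracle SQL sketch of the pre/post-load index handling mentioned above, assuming a hypothetical edw.sales_fact target and sales_fact_idx1 index (names are illustrative; in the actual jobs this logic lived in stored procedures invoked from pre-/post-session commands):

    -- Pre-load: drop the index so the bulk insert does not maintain it row by row
    DROP INDEX edw.sales_fact_idx1;

    -- ... Informatica session loads EDW.SALES_FACT here ...

    -- Post-load: recreate the index once, against the fully loaded table
    CREATE INDEX edw.sales_fact_idx1
        ON edw.sales_fact (product_key, sale_dt)
        NOLOGGING;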

Environment: Informatica PowerCenter 9.x/8.x, Oracle, SQL Server 2008, Linux, IDQ, LSF (job scheduling), PL/SQL, UNIX shell scripting

Confidential, Chicago, IL

ETL/Informatica Developer

Responsibilities:

  • Involved in analysis, design, development, testing, and implementation of Informatica transformations and workflows for extracting the data from multiple systems.
  • Worked cooperatively with team members to identify and resolve various issues relating to Informatica.
  • Worked on dimension as well as fact tables, developed mappings, and loaded data onto the relational database.
  • Developed mappings across multiple-schema databases to load incremental data into dimensions.
  • Experience in Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files & XML Files.
  • Worked extensively on transformations like Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator and Joiner transformations.
  • Created, launched & scheduled sessions and batches using Power Center Server Manager.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session performance management.
  • Responsible for migrating the workflows from development to production environments.
  • Used parameter files, mapping variables, and mapping parameters for incremental loading (see the sketch after this list).
  • Used Connected and Unconnected lookups in various mappings.
  • Scheduled jobs using tools such as Tivoli Workload Scheduler (TWS).
  • Reviewed and documented existing SQL*Loader ETL scripts.
  • Analyzed the session and error logs for troubleshooting mappings and sessions.
  • Used SQL tools such as TOAD to run SQL queries to validate the data.
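
A minimal sketch of the incremental-load approach above, assuming a hypothetical $$LastExtractDate mapping variable and an ORDERS source table (names are illustrative). The variable is supplied through the parameter file and referenced in the Source Qualifier SQL override, which Informatica expands before running the query:

    -- Source Qualifier SQL override: pull only rows changed since the last successful run
    SELECT o.order_id,
           o.customer_id,
           o.amount,
           o.last_update_ts
      FROM orders o
     WHERE o.last_update_ts > TO_DATE('$$LastExtractDate', 'YYYY-MM-DD HH24:MI:SS')

The corresponding parameter file entry (with illustrative folder, workflow, and session names) would look like:

    [ETL_FOLDER.WF:wf_orders_incr.ST:s_m_orders_incr]
    $$LastExtractDate=2015-01-01 00:00:00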

Environment: Informatica Power Center 9.1, PL/SQL, Oracle 10g, Teradata, SQL Server 2008/12, Windows 2000, UNIX, Shell Scripting, Oracle PL/SQL, TOAD 11.0.
