
Sr. ETL Developer Resume


Maryland

SUMMARY

  • 9+ years of IT experience with expertise in analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using ETL tools with RDBMS like Oracle, MS SQL Server, Teradata, DB2 and Snowflake databases on Windows and UNIX platforms.
  • Around 8 years of strong experience in Data Warehousing and ETL using Informatica PowerCenter 10.2.1/9.6.5.1, PowerExchange 10.1/9.6.5.1, Oracle 12c/11g/10g/9i, Teradata 14/13/12 and Erwin.
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5; a hedged SQL sketch of the MD5 comparison pattern appears after this summary.
  • Experience in integration of various data sources like SAP, Oracle, Teradata, Mainframes, SQL Server, XML, flat files and JSON, with extensive experience on Oracle, Teradata and Snowflake.
  • Experience in working with Informatica Intelligent Cloud Services (IICS)
  • Extensively involved in the development of DataStage ETL Parallel jobs for extracting data from different data sources, data transformation and loading the data into data warehouse for data mart operations.
  • Experience with Snowflake cloud Data Warehouse and AWS S3 bucket for integrating data from multiple source systems.
  • Developed the audit activity for all the Informatica IICS (cloud) mappings.
  • Automated/Scheduled the Informatica IICS (cloud) jobs to run daily with email notifications for any failures.
  • 3.5+ years of experience in Informatica Cloud based technologies like ICS, ICRT and IICS.
  • Extensive exposure in overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation, production support.
  • 6+ years of experience using the IDQ tool 10.0/9.6 for profiling, applying rules and developing mappings to move data from source to target systems.
  • Experience in IDQ development around data profiling, cleansing, parsing, standardization, verification, matching and data quality exception monitoring and handling.
  • Very strong in Data Warehousing concepts like Dimensions Type I, II and III, Facts, Surrogate keys, ODS, staging areas and cubes; also well versed in Ralph Kimball and Bill Inmon methodologies.
  • Experience in data modeling and creating LDM and PDM (logical and physical data models) for Star schema and Snowflake schema using MS Visio and Erwin 7.1/4.5.
  • Expert in T-SQL coding and testing: functions, views, triggers, cursors, dictionary, stored procedures etc.
  • Assisted the other ETL developers in solving complex scenarios and coordinated with source systems owners with day-to-day ETL progress monitoring.
  • Expert in writing complex SQL queries and PL/SQL and in optimizing queries in Oracle, SQL Server and Teradata; also excellent in working with Views, Synonyms, Indexes, Partitioning, Database Joins, Stored Procedures, Stats and Optimization.
  • Expertise in Teradata utilities like MultiLoad, FastLoad, FastExport, BTEQ, TPump and TPT, and tools like SQL Assistant and Viewpoint.
  • Skilled in creating and maintaining ETL Specification Documents, Use Cases, Source-to-target mappings and Requirement Traceability Matrices, performing Impact Assessments and providing effort estimates and deployment artifacts.
  • Deftly executed multi-resource projects following the Onsite-Offshore model while serving as a mentor for junior team members.
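
A minimal sketch of the MD5-based change detection referenced in the summary, written as plain Oracle SQL for readability; the staging and dimension tables (stg_customer, dim_customer) and the stored row_md5 column are hypothetical, and inside PowerCenter the equivalent hash is typically built with the MD5() expression function instead.

    -- Detect new or changed customer rows by comparing an MD5 row hash.
    -- Hypothetical tables/columns; Oracle 12c STANDARD_HASH shown for clarity.
    SELECT s.customer_id, s.first_name, s.last_name, s.email
    FROM   stg_customer s
    LEFT   JOIN dim_customer d
           ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL                                -- brand-new customer
       OR  STANDARD_HASH(s.first_name || '|' || s.last_name || '|' || s.email, 'MD5')
           <> d.row_md5;                                        -- changed attributes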

TECHNICAL SKILLS

Data Warehousing/ETL Technology: Informatica PowerCenter 10.1/9.6.5.1/8.6, Informatica PowerExchange 10.1/9.6.5, Informatica Data Quality 10.0/9.6, IICS R29

Database: Oracle 12c/11g/10g, IBM DB2 UDB, MS SQL Server, MS Access 2000, Teradata 15/14/13, Snowflake cloud data warehouse

Data modeling: Erwin 9.1/7.1

Languages: SQL, PL/SQL, XSD, XML, Unix shell scripting

Tools: Microsoft Visio, TOAD, Oracle SQL Developer, WinSQL, WinSCP, Secure Shell Client, OBIEE 10g, SQL Loader, MS Office, Smart FTP, UltraEdit, Autosys, Control-M, HP Quality Center

Operating System: Windows, UNIX

Reports: Cognos 10.0/9.0, QlikView, Tableau 10.0/9

Data Warehouse Methodologies: Ralph Kimball’s Star Schema and Snowflake Schema.

Project Methodologies: SDLC, Agile

Others: MS Word, MS Access, MS Office, GitHub

PROFESSIONAL EXPERIENCE

Confidential, Maryland

Sr. ETL Developer

Responsibilities:

  • Involved in all phases of the SDLC from requirement gathering, design, development, testing and production through user training and support for the production environment.
  • Experience in IBM DataStage (11.7 and under), SQL, QualityStage, Information Analyzer, Cognos (8.x, 11.x) Framework Manager modeling packages, MOTIO and complex report development.
  • Actively involved in interacting with business users to record user requirements and Business Analysis.
  • Involved in an ETL migration project, converting ETL Online to IBM InfoSphere DataStage.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Converted specifications to programs and data mappings in an ETL Informatica IICS (Cloud) environment.
  • Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Created Informatica workflows and IDQ mappings.
  • Created high-level and low-level design documents for Pentaho ETL jobs.
  • Developed Informatica Cloud (IICS) jobs to migrate data from the legacy Teradata Data Warehouse (EDW) to Snowflake Cloud.
  • Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector and populated data into Snowflake from the S3 bucket using complex SQL queries; a hedged COPY INTO sketch appears after this list.
  • Designed and developed ETL jobs using Pentaho to load the insurance coverage and liability details.
  • Loaded diverse data types (structured, JSON, XML, flat files, etc.) into the Snowflake cloud data warehouse.
  • Imported data from HDFS and loaded it into Oracle on a regular basis.
  • Worked on Informatica PowerCenter tools - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
  • Experience in IICS Application Integration components like Processes, Service Connectors and Process Objects.
  • Worked with various complex mappings and designed Slowly Changing Dimension Type 1 and Type 2 mappings.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Experience with tuning DataStage jobs using various techniques by understanding their complexity during volume testing before moving to production.
  • Resolving the tickets based on the priority levels raised by QA team.
  • Developed Parameter files for passing values to the mappings for each type of clients.
  • Worked on Informatica Intelligent Cloud Services (IICS) Application Integration components like Processes, Service Connectors and Process Objects to integrate with other applications and with CIH, CDI and CAI.
  • Applied CI/CD principles to query development and integration with the existing data flow set-up.
  • Assisted the other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.
  • Understanding the entire functionality and major algorithms of the project and adhering to the company testing process.
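
A minimal sketch of the S3-to-Snowflake load pattern referenced above; the bucket, stage and table names are hypothetical, and authentication (storage integration or IAM role) is omitted.

    -- Hypothetical external stage over the landing bucket written by the
    -- Informatica S3 connector; credentials/storage integration omitted.
    CREATE OR REPLACE STAGE edw_s3_stage
      URL = 's3://example-landing-bucket/edw/'
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    -- Bulk-load the staged files into the target Snowflake table.
    COPY INTO edw.public.coverage_liability
    FROM @edw_s3_stage/coverage/
    ON_ERROR = 'ABORT_STATEMENT';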

Environment: Informatica PowerExchange 10.1, IICS R29, Informatica Data Quality (IDQ) 10.0, Tableau 10.0, Linux, SQL, PL/SQL, TOAD, Snowflake Cloud Data Warehouse, Kafka, Teradata 15, AWS S3 Bucket, SQL Server 2012, Oracle 12c, Control-M, Shell Scripting, JSON, SQL Loader

Confidential, New York

ETL/Informatica IICS Developer

Responsibilities:

  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Developed the audit activity for all the Informatica IICS (cloud) mappings.
  • Automated/Scheduled the Informatica IICS (cloud) jobs to run daily with email notifications for any failures.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Developed Informatica Cloud (IICS) jobs to migrate data from the legacy Teradata Data Warehouse (EDW) to Snowflake Cloud.
  • Worked on converting specifications to programs and data mapping in an ETL Informatica Intelligent Cloud Services (IICS) environment.
  • Imported data from HDFS and loaded it into Oracle on a regular basis.
  • Developed and tested ETL scripts to load historical data to seed the data mart tables for testing, dry runs and production migration from DataStage 9.1 to DataStage 11.7; a hedged seed-load sketch appears after this list.
  • Created and scheduled DataStage job sequencer jobs based on load sequence.
  • Worked with data architects and developers to design efficient ETL jobs using intermediate tables.
  • Performed performance tuning of PL/SQL code and ETL jobs.
  • Migrated Mappings, Sessions and Workflows from the Development to the Test and then to the UAT environment.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Experience with Snowflake cloud Data Warehouse and AWS S3 bucket for integrating data from multiple source systems.
  • Worked with batch operations on Control-M, Tidal and/or Autosys (third-party) scheduler changes in order to schedule DataStage jobs as per requirements.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Imported various heterogeneous files using Informatica Power Center 9.x Source Analyzer.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Created XML files and Autosys jobs for the developed workflows.
  • Migrated code from Development to Test and, upon validation, to Pre-Production and Production environments.
  • Provided technical assistance to business program users, and developed programs for business and technical applications.
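
A minimal sketch of the kind of historical seed load described above, shown as generic SQL rather than the actual DataStage job logic; the schema, table and column names and the cut-off date are hypothetical.

    -- One-time backfill to seed a mart fact table before the migrated
    -- DataStage 11.7 jobs take over incremental loads (names illustrative).
    INSERT INTO mart.fact_claims_hist
           (claim_key, member_key, claim_dt, paid_amt, load_dt)
    SELECT  c.claim_key,
            c.member_key,
            c.claim_dt,
            c.paid_amt,
            CURRENT_DATE
    FROM    edw.claims c
    WHERE   c.claim_dt < DATE '2016-01-01';   -- historical cut-off for the dry run
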
Environment: Informatica PowerCenter 10.1/9.6, Informatica PowerExchange 10.1/9.6, IICS, Data Quality 9.6, SQL Server 2008, Shell Scripts, Teradata 14, SQL, Oracle 11g, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center, Cognos 9, T-SQL.

Confidential, Tampa, FL

ETL Informatica Developer

Responsibilities:

  • Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
  • Worked on project documentation which included the Functional, Technical and ETL Specification documents.
  • Designed the ETL processes using Informatica PowerCenter to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.
  • Involved in end-to-end system testing for ETL Pentaho jobs.
  • Created transition design documents and provided ETL knowledge transfer to the charter company.
  • Developed data replication and ETL jobs using Oracle database.
  • Worked on Development, Testing and Implementation of ETL processes using Informatica Intelligent Cloud Services (IICS)
  • Involved in designing the star schema and populating the fact table and associated dimension tables.
  • Extensively worked on complex mappings which involved slowly changing dimensions; a hedged SCD Type 2 SQL sketch appears after this list.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets & Parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
  • Worked on developing Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces based on the requirements and limitations of the Project.
  • Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
  • Provided 24x7 production support when necessary.
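
A minimal SQL sketch of the Slowly Changing Dimension Type 2 handling referenced above; in the project this logic lived in PowerCenter mappings (Lookup and Update Strategy transformations), and the table, column and sequence names here are hypothetical.

    -- 1) Expire the current version of rows whose tracked attributes changed.
    UPDATE dim_policy d
    SET    d.eff_end_dt   = SYSDATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_policy s
                   WHERE  s.policy_id = d.policy_id
                   AND    s.coverage_cd <> d.coverage_cd);

    -- 2) Insert a fresh current version for changed and brand-new policies.
    INSERT INTO dim_policy
           (policy_key, policy_id, coverage_cd, eff_start_dt, eff_end_dt, current_flag)
    SELECT  dim_policy_seq.NEXTVAL, s.policy_id, s.coverage_cd,
            SYSDATE, DATE '9999-12-31', 'Y'
    FROM    stg_policy s
    LEFT   JOIN dim_policy d
           ON  d.policy_id = s.policy_id
           AND d.current_flag = 'Y'
    WHERE  d.policy_id IS NULL;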

Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, Oracle 11g, SQL, Erwin 5, UNIX CRONTAB, Control-M, Remedy Incident Tool, Ultra Edit, Teradata 13.

Confidential, Lakeland, FL

Informatica Developer / Data Analyst

Responsibilities:

  • Using Informatica PowerCenter Designer, analyzed the source data to extract & transform from various source systems (Oracle 10g, DB2, SQL Server and flat files) by incorporating business rules using different objects and functions that the tool supports.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Developed Stored Procedures, used them in Stored Procedure transformations for data processing and used data migration tools.
  • Documented Informatica mappings in Excel spread sheet.
  • Tuned the Informatica mappings for optimal load performance.
  • Used BTEQ, FEXP, FLOAD and MLOAD Teradata utilities to export and load data to/from flat files; a hedged BTEQ export sketch appears after this list.
  • Created and configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Generated reports using OBIEE 10.1.3 for future business utilities.
  • Worked along with the UNIX team writing UNIX shell scripts to customize the server scheduling jobs.
  • Constantly interacted with business users to discuss requirements.
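
A minimal BTEQ sketch of the kind of flat-file export mentioned above; the TDPID, credentials, output path and table are placeholders.

    .LOGON tdprod/etl_user,etl_password;            -- placeholder credentials
    .EXPORT REPORT FILE = /data/out/member_extract.txt;
    SELECT member_id, first_name, last_name, status_cd
    FROM   edw.member
    WHERE  status_cd = 'A';
    .EXPORT RESET;
    .LOGOFF;
    .QUIT;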

Environment: Informatica 9.5.0, Oracle 10g, SQL Server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin 4.x, UNIX, Tortoise SVN, Flat files.
