
ETL Developer Resume


Coventry, RI

PROFESSIONAL SUMMARY:

  • 8 years of focused experience in Information Technology with a strong background in developing BI applications in verticals such as Healthcare, Insurance, Finance, and Pharmacy.
  • Experience in Analysis, Design, and Development of various business applications in different platforms in data warehousing using Informatica 9.x/8.x/7.1, Oracle, Sybase, Teradata, and SQL server.
  • Worked on creating an iPaaS (Integration Platform as a Service) layer using Informatica Cloud Real Time components (SPI) with cloud-based RMS(one).
  • Extensive experience in using Informatica tools to implement ETL methodology in data extraction, transformation, and loading.
  • Used Informatica Power Center 9.X to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Center, B2B Data Transformation, Informatica Data Quality, MDM, SSIS, and OBIEE.
  • Experience in Business Intelligence applications design, development and Maintenance of Integration Services (SSIS), SQL Server Reporting Services.
  • Hands on Experience in Installing and Configuring Informatica Secure Agent (Informatica cloud).
  • Extensive knowledge of relational and dimensional data modeling, star/snowflake schemas, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
  • Experience in using Informatica Client Tools - Designer, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor, Analyst, Developer tools.
  • Developed various mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
  • Experience in creating High Level Design and Detailed Design in the Design phase.
  • Experience in integration of various data sources like Oracle, DB2, MS SQL Server and Flat Files.
  • Experience building SSIS packages to extract, transform, and load data (ETL); designed packages using tasks and transformations such as Data Conversion and Pivot.
  • Extensive experience in requirements gathering, analysis, impact analysis, design, development, quality assurance and quality control, mainframe testing (unit, system, and integration testing, plus UAT support), QA testing, implementation, project management, defect tracking, causal analysis of defects, and project status reporting.
  • Hands-on experience in SAS Base, SAS Macros, UNIX, SQL, and Teradata, working with business users daily to resolve technical queries.
  • Generated surrogate keys for composite attributes while loading data into dimension tables using the Surrogate Key Generator.
  • Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals and transformation tuning.
  • Supported MDM initiatives using Oracle databases and Informatica Data Quality (IDQ).
  • Experience in SQL and PL/SQL - stored procedures, triggers, packages, and functions.
  • Extracted data from multiple operational sources and loaded staging areas, the Data Warehouse, and data marts using CDC and SCD (Type 1/Type 2) loads (a sketch of the SCD pattern follows this list).
  • Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data Staging for operational sources using ETL and data mining features for data warehouses.
  • Extensively involved in writing UNIX shell scripts.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Coordinated with offshore, onsite, QA, scheduling, change management, business, and inter-dependent teams.
  • Working knowledge of the test management tool HP ALM/QC.
  • Implemented performance-tuning techniques at application, database and system levels.
  • Skilled in developing Test Plans, Creating and Executing Test Cases.
  • Experience working in Scrum & agile methodology and ability to manage change effectively.
  • Excellent communication and interpersonal skills; quickly assimilate new technologies, concepts, and ideas.
  • Create, maintain, and monitor TIDAL jobs for QNXT and UNIX, SQL, FTP, and SFTP operations; respond to requests and escalate to the appropriate support group when necessary.
  • Create and debug TIDAL jobs on a weekly basis to automate new processes in UNIX, SQL, FTP and SFTP based on business requests or internal Information Technology (IT) requests.
  • Worked with various scheduling solutions including Control-M, Autosys, Tidal, Enterprise scheduler.
  • Involved in migrating physical Linux/Windows servers to cloud (AWS) and testing.
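
The SCD loads noted above generally follow a pattern like the sketch below. This is a minimal illustration only: the stg_customer / dim_customer tables, their columns, and the dim_customer_seq sequence are hypothetical, and the syntax assumes an Oracle-style dialect.

    -- Expire the current version of any dimension row whose tracked attributes changed
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_date = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_nk = d.customer_nk
                      AND (s.customer_name <> d.customer_name
                           OR s.customer_city <> d.customer_city));

    -- Insert a new current version (Type 2) for new and changed business keys,
    -- generating the surrogate key from a sequence
    INSERT INTO dim_customer
           (customer_sk, customer_nk, customer_name, customer_city,
            effective_start_date, effective_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_nk, s.customer_name, s.customer_city,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_nk = s.customer_nk
                          AND d.current_flag = 'Y');

In an Informatica mapping the same logic is typically expressed with Lookup, Expression, and Update Strategy transformations rather than hand-written SQL; the statements above show the equivalent set-based result.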

TECHNICAL SKILLS:

ETL Tools/Data Modeling Tools: Informatica Power Center, Power Exchange 9.5/9.1/8.6/8.1/7.1 (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), IDQ, Informatica Cloud, MSBI, Erwin; Fact and Dimension Tables, Physical and Logical Star Schema Modeling.

Databases: Oracle 12c/11g/10g/9i, Teradata, MS SQL Server, MS Access, SQL, PL/SQL

Tools: Toad, SQL developer, Visio, Teradata SQL Assistant

Languages/Scripting: SQL, PL/SQL, T-SQL, UNIX, Shell Scripting, Batch Scripting

Operating Systems: UNIX, Windows Server 2008/2003, LINUX.

Job Scheduling: Informatica Scheduler, Tidal Enterprise Scheduler, Control M, CA Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Coventry, RI

ETL Developer

Responsibilities:

  • Extensively used the SSIS Import/Export Wizard for performing ETL operations.
  • Design and develop ETL infrastructure for over 35 Salesforce organization instances.
  • Develop mappings, mapping configuration tasks, data synchronization tasks, and data replication tasks with Informatica Cloud Services.
  • Developed the audit activity for all the cloud mappings.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Generated automated load statistics, comparing current-day staging counts against previous-day counts.
  • Performed data loading from source to target using Informatica Cloud.
  • Maintain daily backups of all Salesforce organizations in SQL Server 2012.
  • Provide debugging and troubleshooting support for SQL stored procedures, PowerShell scripts, Salesforce workflows, triggers, and Visualforce components.
  • Develop Visual Basic applications to automate manual procedures.
  • Create validation queries to support quality control procedures that ensure data integrity in the Data Warehouse (see the validation sketch after this list).
  • Design, build and support methods to manage numerous data integration and reporting tasks. Tools include Salesforce, Informatica, MSSQL, Python.
  • Drive the organization toward metadata-driven ETL procedures using automated, self-auditing scripts and the tools referenced above, replacing manual data duplication in Explorer, Access, and Excel.
  • Successfully implemented a POC for migrating Informatica Cloud jobs to SSIS.
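
A minimal sketch of the kind of validation query behind the quality-control checks above, comparing staging counts against warehouse counts per load date. The stg_account / dw_account tables and the load_ts column are assumptions for illustration; the syntax is T-SQL.

    -- Flag load dates where staging and warehouse row counts disagree
    SELECT COALESCE(s.load_date, t.load_date) AS load_date,
           ISNULL(s.src_rows, 0) AS staging_rows,
           ISNULL(t.tgt_rows, 0) AS warehouse_rows,
           ISNULL(s.src_rows, 0) - ISNULL(t.tgt_rows, 0) AS row_diff
    FROM  (SELECT CAST(load_ts AS date) AS load_date, COUNT(*) AS src_rows
           FROM stg_account GROUP BY CAST(load_ts AS date)) s
    FULL OUTER JOIN
          (SELECT CAST(load_ts AS date) AS load_date, COUNT(*) AS tgt_rows
           FROM dw_account GROUP BY CAST(load_ts AS date)) t
      ON s.load_date = t.load_date
    WHERE ISNULL(s.src_rows, 0) <> ISNULL(t.tgt_rows, 0);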

Environment: Informatica Cloud Real Time (ICRT), Salesforce, SQL Server 2012 Enterprise Edition, SSIS, T-SQL, Windows Server 2012.

Confidential, Houston, TX

ETL Developer

Responsibilities:

  • Extensively used the SSIS Import/Export Wizard for performing ETL operations.
  • Responsible for development, support, and maintenance of ETL (Extract, Transform, and Load) processes using Informatica Power Center.
  • Built the dimension and fact table load processes and reporting processes using Informatica.
  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Involved in data integration administration, monitoring, and auditing using Informatica Cloud Designer.
  • Experience building and managing integrations with the Integration Cloud Platform - Integration Platform-as-a-Service (iPaaS).
  • Utilized the wizard-based Informatica Cloud Services environment (Sandbox) to develop, share, and test tasks before migrating them to production. Involved in importing existing PowerCenter workflows as Informatica Cloud Service tasks using Informatica Cloud Integration.
  • Worked on Data Synchronization and Data Replication in Informatica cloud .
  • Worked with Informatica Cloud to integrate Salesforce and load data from Salesforce into an Oracle database.
  • Extracted data from various data sources such as Oracle, SQL Server, and flat files, then transformed and loaded it into targets using Informatica.
  • Created mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings.
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
  • Enhanced PL/SQL procedures for bulk loading from source systems such as Teradata and SAP.
  • Identified and eliminated duplicates in datasets through IDQ 9.6 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling a single view of the customer and helping control mailing-list costs by preventing duplicate mailings.
  • Formatted and generated data to support the SAP BusinessObjects interface, giving users access to data from the Data Warehouse.
  • Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
  • ETL experience in using SAP BODS, Cognos Data Manager, Informatica, and SSIS.
  • Worked with SAP and Oracle sources to process the data.
  • Job Alerts, Job Classes, Queues, Job Actions, Variables, Calendars, File Events, and on-demand email jobs are some of the features deployed in our TIDAL environment.
  • Used Power BI Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
  • Created groups, content packs, and the enterprise gateway in Power BI Pro.
  • Designed data models in Power BI, including fact and dimension tables and their relationships.
  • Worked with T-SQL to create tables, views, triggers, and stored procedures.
  • Used indexes to enhance the performance of individual queries and stored procedures.
  • Helped customers resolve refresh issues in the Power BI online service.
  • Responsible for creating SQL datasets for Power BI and ad-hoc reports (see the T-SQL sketch after this list).
  • Created, maintained, and scheduled various reports in Power BI, such as tabular and matrix reports.
  • Experience in configuring and deploying SSRS.
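
Indicative T-SQL for the dataset and index tuning work described above. The fact_claims table, its columns, and the usp_claims_summary procedure are hypothetical names used only to show the shape of the work.

    -- Covering index to speed up the summary query behind the Power BI dataset
    CREATE NONCLUSTERED INDEX ix_fact_claims_service_date
        ON dbo.fact_claims (service_date)
        INCLUDE (member_id, paid_amount);
    GO

    -- Stored procedure consumed as an ad-hoc SQL dataset in Power BI
    CREATE PROCEDURE dbo.usp_claims_summary
        @start_date date,
        @end_date   date
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT service_date,
               COUNT(DISTINCT member_id) AS members,
               SUM(paid_amount)          AS total_paid
        FROM dbo.fact_claims
        WHERE service_date BETWEEN @start_date AND @end_date
        GROUP BY service_date;
    END;
    GO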

Environment: Informatica Cloud, PowerCenter, SQL Server 2005 Enterprise Edition, Windows Server 2003, Oracle 11g, Power BI

Confidential, Malvern, PA

Lead ETL Developer

Responsibilities:

  • Participated in requirement meetings with Business Analysts to understand analytical content needs.
  • Assisted in documenting these requirements, resolving ambiguities and conflicts, and ensuring requirements are complete.
  • Worked directly with Client Operations and the Client Tech team to understand business scope; performed requirement analysis, converted business requirements into technical terms, and worked with the offshore team on coding and unit testing.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center. Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Worked on SAP data migration for Human Resources and Finance, converting various objects covering Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll, etc., to generate reports from the SAP BI system.
  • Used Informatica Cloud Customer 360 to cleanse and standardize data from Salesforce.
  • Extensive ETL experience using DTS/SSIS for Data Extractions, Transformations and Loads.
  • Design database table structures for transactional and reference data sources.
  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Initiate the data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
  • Extracted data from various data sources such as Oracle, SQL Server, Flat files, transformed, and loaded into targets using Informatica.
  • Proficiency in Single Sign On (SSO) configuration on Informatica Cloud.
  • Created source and Target Definitions, Reusable transformations, mapplets and worklets.
  • Created mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Integrated Swagger API docs into all existing APIs in transformations.
  • Learned to use the Informatica Cloud Real Time for Salesforce managed package, which provides out-of-the-box integration with Salesforce and creation of Salesforce-specific guides.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Created SSIS packages for migrating data from legacy systems to a centralized database.
  • Designed and tested packages to extract, transform, and load data (ETL) using SSIS.
  • Designed packages utilizing tasks and transformations such as Execute SQL Task, Data Flow Task, Data Conversion, and Foreach Loop Container, mapping the required data from source to destination.
  • Expert in using the workflow canvas for multiple data transformations and in using features like variables, package configurations, event handlers, error logging, and checkpoints to facilitate complex ETL logic and specifications.
  • Implemented logging and event handlers on SSIS packages for troubleshooting the .NET application.
  • Defined system trust and validation rules for base object columns.
  • Performed Data Steward Activities like manual merge/unmerge and uploading data.
  • Implemented auto merge and manual merge to build the best version of truth (a survivorship sketch follows this list).
  • Created DEV/QA/Test environments and offered training to IT employees on how to build jobs, understand best practices, and utilize Tidal's features in their development.
  • Job Alerts, Job Classes, Queues, Job Actions, Variables, Calendars, File Events, and on-demand email jobs are some of the features deployed in our TIDAL environment.
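
The auto-merge / best-version-of-truth step above can be approximated in set-based SQL as below. The customer_raw and customer_golden tables, the trust_score column, and the survivorship rule (highest trust, then most recent update, wins) are illustrative assumptions, not the actual MDM configuration.

    -- Keep one surviving record per customer key: highest trust score, then most recent update
    INSERT INTO customer_golden (customer_key, customer_name, address_line1, city, source_system)
    SELECT customer_key, customer_name, address_line1, city, source_system
    FROM (
        SELECT r.*,
               ROW_NUMBER() OVER (PARTITION BY r.customer_key
                                  ORDER BY r.trust_score DESC, r.last_update_ts DESC) AS rn
        FROM customer_raw r
    ) ranked
    WHERE rn = 1;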

Environment: Informatica Power Center 9.5/9.1, IDQ, SQL Server, UNIX, Toad, SQL Developer, SSIS, Putty, SFTP/FTP

Confidential, Dallas, TX

Sr. ETL Developer

Responsibilities:

  • Interacted with users and made changes to Informatica mappings according to business requirements.
  • Experienced in database programming for Data Warehouses (schemas); proficient in dimensional modeling (star schema and snowflake modeling).
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router.
  • Worked on designing, creating, and deploying SSIS packages for transferring data between flat files, Excel spreadsheets, heterogeneous data sources, and SQL Server.
  • Used Informatica Cloud (and other tools) to extract, transform, and load data into our custom-built Care warehouse.
  • Worked with Netezza/Teradata databases to implement data cleanup and performance-tuning techniques.
  • Experience with Netezza database integration with Informatica and load processes using Netezza bulk load utilities such as the Netezza bulk reader and writer.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Generated unique keys for composite attributes while loading data into the Data Warehouse.
  • Created views using Cisco Data Virtualization (Composite) data sources from the Data Warehouse, Exadata, and Oracle EBS; published the views for business users to extract data in Spotfire for various purposes.
  • Extensively used data cleansing and data conversion functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR, and IIF in Expression transformations.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Used Address validator transformation in IDQ.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created partitioned tables and partitioned indexes for manageability and scalability of the application (see the partitioning sketch after this list). Made use of post-session success and post-session failure commands in the session task to execute scripts needed for cleanup and update purposes.
  • Developed and tuned various mappings using Oracle and SQL*Plus in the ETL process.
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Created Use-Case Documents to explain and outline data behavior.
  • Working with Informatica Developer (IDQ) tool to ensure data quality to the consumers.
  • Used the Address Validator transformation for validating customer addresses from various countries via the SOAP interface.
  • Created PL/SQL procedures, functions, packages, and cursors to extract data from the target system.
  • Performed analysis, design, and implementation of batch processing workflows using Cisco Tidal Enterprise Scheduler; monitored daily cycles.
  • Scheduled batch jobs through Tidal Scheduler or the Unix Batch server to retrieve output files successfully to be sent to requesting parties.
  • Utilized Tidal Enterprise Scheduler functions to establish job streams with complex dependencies, manage intricate calendar schedules, and perform Tidal agent installations with specific deliverables for 50-plus application teams throughout the environment.
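
A minimal sketch of the partitioning approach referenced above, assuming Oracle range partitioning by month; the fact_sales table and its columns are illustrative.

    -- Range-partitioned fact table: each month lands in its own partition
    CREATE TABLE fact_sales (
        sale_id     NUMBER        NOT NULL,
        sale_date   DATE          NOT NULL,
        customer_sk NUMBER        NOT NULL,
        amount      NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
        PARTITION p2013_01 VALUES LESS THAN (DATE '2013-02-01'),
        PARTITION p2013_02 VALUES LESS THAN (DATE '2013-03-01'),
        PARTITION p_max    VALUES LESS THAN (MAXVALUE)
    );

    -- Local index is partitioned the same way, keeping index maintenance per-partition
    CREATE INDEX ix_fact_sales_cust ON fact_sales (customer_sk) LOCAL;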

Environment: Informatica Power Center 9.1, IDQ, SAS, Oracle 11g, UNIX, PL/SQL, SQL*Plus, SQL Server 2008 R2, SSIS, TOAD, MS Excel 2007.

Confidential, Walnut Creek, CA

Informatica Developer

Responsibilities:

  • Visualize a data architecture design from high level to low level, and design performance objects for each level.
  • Troubleshoot database issues related to performance, queries, and stored procedures.
  • Develop and maintain data marts on an enterprise data warehouse to support various defined dashboards such as Imperative for Quality (IQ) program.
  • Designed and developed data models and database architecture by translating abstract relationships into logical structures.
  • Proficient in defining Key Performance Metrics (KPIs), facts, dimensions, hierarchies and developing Star and Snow Flake schemas.
  • Designed and developed complex Informatica mappings using Expression, Aggregator, Filter, Lookup, and Stored Procedure transformations, working extensively with flat files, to ensure movement of data between various applications.
  • Worked on SSIS script task, look up transformations and data flow tasks using T-SQL and Visual Basic (VB) scripts
  • Extracted data from source systems to a Data Mart running on Teradata.
  • Worked on extracting data from legacy systems such as mainframes and Oracle into Teradata.
  • Source data analysis and data profiling for data warehouse projects.
  • Design and implement all stages of data life cycle. Maintain coding and naming standards.
  • Developed end-to-end ETL processes for Trade Management Data Mart Using Informatica.
  • Implemented various loads such as daily, weekly, and quarterly loads using an incremental loading strategy (see the watermark sketch after this list).
  • Extensively worked with Slowly Changing Dimensions Type1 and Type2.
  • Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Oracle for optimal performance.
  • Developed reconciliation scripts to validate the data loaded in the tables as part of unit testing.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Prepared scripts to email records that did not satisfy the business rules (error records) to the business users who uploaded them.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding, and referencing data from various source systems using the Informatica Data Quality (IDQ) toolkit. Worked with Address Doctor and distance algorithms (Bigram/Jaro/Edit/Hamming/Reverse) in IDQ.
  • Used Informatica data quality (IDQ) in cleaning and formatting customer master data.
  • Built logical data objects (LDO) and developed various mappings, Mapplets/rules using Informatica data quality (IDQ) based on requirements to profile, validate and cleanse the data.
  • Prepared UNIX shell scripts to process file uploads (one of the data sources), moving the uploads through the different stages (Landing, Staging, and Target tables).
  • Worked on TOAD and Oracle SQL to develop queries and create procedures.
  • Created the mapping specification, workflow specification and operations guide for the Informatica projects and MFT run book as part of end user training.
  • Experience working in agile methodology and ability to manage change effectively.
  • Assigned work to onshore and offshore developers and was responsible for providing technical expertise for the design and execution of ETL projects.
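
The incremental (daily/weekly/quarterly) loads above typically hinge on a watermark, roughly as sketched below. The etl_batch_control table, the src_trades source, and the column names are assumptions; the syntax is Oracle-flavored.

    -- Pull only rows changed since the last successful load of this subject area
    SELECT t.trade_id,
           t.account_id,
           t.trade_amount,
           t.last_update_ts
    FROM   src_trades t
    WHERE  t.last_update_ts > (SELECT b.last_load_ts
                               FROM   etl_batch_control b
                               WHERE  b.subject_area = 'TRADE_MGMT');

    -- After a successful load, advance the watermark for the next run
    UPDATE etl_batch_control
    SET    last_load_ts = SYSTIMESTAMP
    WHERE  subject_area = 'TRADE_MGMT';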

Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Power Exchange, SQL, SSIS, Oracle 11g, Flat Files, UNIX Shell Scripting

Confidential

DW Engineer

Responsibilities:

  • Implement procedures to maintain, monitor, backup and recovery operations for ETL environment.
  • Conduct ETL optimization, troubleshooting and debugging.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, and reusable transformations.
  • Wrote complex SQL overrides for Source Qualifiers and Lookups in mappings (see the override sketch after this list).
  • Planned, defined and designed data flow processes for data migration to the Data Warehouse using SSIS.
  • Exported and imported data from other data sources such as flat files using SSIS Import/Export.
  • Created SSIS package to load data from Flat File to Data warehouse using Lookup, Derived Columns, Sort, Aggregate, Pivot Transformation and Slowly Changing Dimension.
  • Designed and developed validation scripts based on business rules to check the Quality of data loaded into EBS.
  • Created the Data Flow Diagrams for the full run and the reprocess partial run for the workflows to be created in Informatica taking into point the dependencies using Microsoft Visio.
  • Implemented best practices in ETL Design and development and ability to load data into highly normalized tables and star schemas.
  • Designed and developed mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator, Web services Consumer, XML Generator Transformations.
  • Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Stored reformatted data from relational, flat file, XML files using Informatica (ETL).
  • Developed mapping to load the data in slowly changing dimension.
  • Involved in Design Review, code review, test review, and gave valuable suggestions.
  • Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimension) Type 1 and Type 2.
  • Responsible for offshore Code delivery and review process
  • Used Informatica to extract data from DB2, XML, and flat files and load it into Teradata.
  • Prepared SQL queries to validate the data in both source and target databases.
  • Extracted data from various data sources, transformed it, and loaded it into targets using Informatica.
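
An indicative Source Qualifier SQL override of the kind mentioned above, pushing the join, filter, and sort down to the source database. The orders/customers tables and the $$LAST_RUN_DATE mapping parameter are illustrative, not taken from the actual project.

    -- Source Qualifier override: join, filter, and sort handled at the source DB
    SELECT o.order_id,
           o.order_date,
           o.order_amount,
           c.customer_id,
           c.customer_name
    FROM   orders o
    JOIN   customers c
           ON c.customer_id = o.customer_id
    WHERE  o.order_status <> 'CANCELLED'
      AND  o.last_update_ts > TO_DATE('$$LAST_RUN_DATE', 'YYYY-MM-DD HH24:MI:SS')
    ORDER BY o.order_id;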

Environment: Informatica Power Center 8.6, Oracle 9i, DB2, Sybase, Rapid SQL, SQL Server, SSIS, Erwin, UNIX.
