Sr. ETL Developer/Analyst Resume

NC

SUMMARY

  • Over 8 years of programming experience as an Oracle PL/SQL and ETL/Informatica Developer in data warehousing, specializing in Extract, Transform, and Load (ETL) with particular expertise in Informatica, Informatica Data Quality (IDQ), Master Data Management (MDM), Web Services, Oracle, Teradata, OLAP, SSIS, and UNIX shell scripting. Specialized in analysis, design, development, implementation, modeling, testing, and support of data warehousing (DWH) applications built from heterogeneous data sources.
  • Experience in data warehousing, covering the architecture, design, development, and implementation of data warehouse ETL processes.
  • Extensively involved in ETL data warehousing using Informatica Data Quality (IDQ) and Informatica PowerCenter 8.x/9 Designer tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Developed mappings using Informatica Power Center Transformations - Lookup (Unconnected, Connected), Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
  • Excellent working and conceptual knowledge of Oracle, PL/SQL, Oracle Forms and Reports development, C/C++, Java, Ajax, jQuery, and HTML.
  • Experience in developing applications using Oracle 10g/9i/8i, SQL, PL/SQL, Oracle Forms 10g/9i/6i, and Reports 10g/9i/6i.
  • Good experience in developing PL/SQL stored procedures, functions, packages, and database triggers using features such as cursors, ref cursors, analytical functions, collections, and exception handling.
  • Extensive experience in database analysis and design, database modeling, ER diagrams, normalization, and de-normalization.
  • Good experience as a data analyst migrating data from Oracle to CSV or Excel files.
  • Strong experience using Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB).
  • Tuned SQL queries using Explain Plan and performed refinement of the database design leading to significant improvement in system response time and efficiency.
  • Worked with dimensional data warehouses in Star and Snowflake schemas.
  • Created Type II and Type III SCDs (slowly changing dimensions) to keep track of historical data (see the Type II sketch after this list).
  • Expertise in loading data from legacy systems using SQL*Loader.
  • Have in-depth knowledge in Data analysis, Data warehousing and ETL techniques, Business Objects, SQL, PL/SQL scripts.
  • Good understanding of Relational database design and developed various data models using ERWIN and Oracle Designer.
  • Extensively worked with third party database tools like TOAD & PL/SQL Developer.
  • Experience using Oracle large object data types such as BLOB and CLOB.
  • Experience with the software development life cycle (SDLC) process, performing detailed analysis, collecting and documenting requirements, coding and unit testing, and integration and system testing.
  • Experience coordinating with off-shore and on-site teams to explain client requirements.
  • Experience analyzing complex performance issues and working with DBAs to suggest valuable ways to fix the problem.
  • Excellent Logical and Analytical Skills, Enthusiastic, and good Interpersonal Skills.
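
A minimal sketch of the Type II SCD load referenced above, in Oracle SQL; DIM_CUSTOMER, STG_CUSTOMER, the DIM_CUSTOMER_SEQ sequence, and all column names are hypothetical placeholders for the actual warehouse objects.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_date = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for new or changed customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, segment,
            effective_start_date, effective_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');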

TECHNICAL SKILLS

RDBMS: Oracle 9i / 10g / 11g, SQL Server 2005/2008, Teradata

ETL Tools: Informatica Power Center 9.x / 8.x, Informatica Data Quality (IDQ) 9 / 8.x

Languages: SQL, PL/SQL, TSQL, VB, C#, UNIX Shell Scripting

Tools: TOAD, SQL Developer, PowerBuilder, Tidal, Autosys

Oracle Utilities: SQL*Loader, EXP/IMP, EXPDP/IMPDP, RMAN

Operating Systems: Windows Server 2000/2003, Oracle Enterprise Linux, Red Hat Linux

PROFESSIONAL EXPERIENCE

Confidential, NC

Sr ETL Developer/Analyst

Responsibilities:

  • Involved in design of database and created Data marts extensively using Star Schema.
  • Involved in implementing the data integrity validation checks through constraints and triggers.
  • Implemented data archival and retirement projects using the Informatica Data Archive tool.
  • Responsible for developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected LOOKUP transformations, Router, Filter, Aggregator, Expression and Update strategy transformations for Large volumes of Data.
  • Identified and eliminated duplicates in datasets through IDQ 9.x components such as Edit Distance and Mixed Field Matcher, enabling the creation of a single view of the customer.
  • Ensure IDQ solutions are fit for purpose and deliver against business requirements.
  • Project life cycle - from analysis to production implementation, with emphasis on identifying the source and source data validation, developing particular logic and transformation as per the requirement and creating mappings and loading the data into different targets.
  • Support business User Acceptance Testing (UAT) activities, defining user test approaches and test scripts where appropriate for data profiling output and development of DQ rules.
  • Created scripts to schedule inbound feed and outbound extract processes using Tidal and Tivoli.
  • Performed dimensional data modeling, including Star Join schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling, using ERwin.
  • Designed and developed the reporting system using MicroStrategy and developed various reports using Cognos Report Studio.
  • Extensively involved in application tuning, SQL tuning, memory tuning and I/O tuning using Explain Plan and SQL trace facilities.
  • Enhanced query performance through optimization techniques such as index creation, table partitioning, and stored procedures (see the tuning sketch after this list).
  • Used TOAD to run SQL queries and validate the data in the warehouse.
  • Developed and maintained ETL scripts for Informatica Cloud Services.
  • Performed load and integration tests on all programs created and applied version control procedures to ensure that programs are properly implemented in production.
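
A minimal sketch of tuning with Explain Plan as referenced above; the FACT_SALES table, its columns, and the index name are hypothetical and for illustration only.

    EXPLAIN PLAN FOR
    SELECT region_id, SUM(sale_amount)
      FROM fact_sales
     WHERE sale_date >= DATE '2013-01-01'
     GROUP BY region_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows a costly full scan on the date filter, one option is a
    -- supporting index (or range-partitioning the table on SALE_DATE):
    CREATE INDEX fact_sales_date_ix ON fact_sales (sale_date);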

Environment: Informatica Power Center 9.1, Informatica Data Quality (IDQ), Power Designer, Power Exchange, Scheduler, Master Data Management (MDM), Work Flows, ETL, Microstrategy 9.3, Visio, Flat Files, OBIEE 10.x, Oracle 11g, XML, Webservices, SOAP, SOAP UI, SQL SERVER, Teradata, Citrix, TIDAL, TOAD 9.6.1, Business Objects XI, Rational Clear Case, Rational Clear Quest, Lotus, Unix Shell Scripts, BASH, SQL Navigator, Windows XP, TIVOLI, Autosys.

Confidential, NC

Sr ETL Developer/Analyst

Responsibilities:

  • Worked with salesforce.com sources in the latest version of Informatica to pull data and load it into the target database.
  • Developed mappings using Informatica Power Center Transformations - Lookup (Unconnected, Connected), Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
  • Expertise and experience in ETL design, ETL performance tuning and Data Modeling.
  • Expertise in designing and developing Business Intelligence solutions in Data Warehousing/Decision Support Systems using ETL tool Informatica Power Center 8.6.1
  • Hands-on and strong experience in developing Informatica Mappings from version 8.6.1 to 9.1.0
  • Extensively worked on Informatica Designer Components-Source Analyzer, Warehouse Designer, Mapping Designer &Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Development, unit testing and migration in ETL Tool Informatica.
  • Experienced in integrating various sources such as Oracle databases and flat files; developed core mappings, sessions, and workflows for data extraction, transformation, and loading into the data mart.
  • Involved in developing packages that implement business logic through procedures and functions (see the package sketch after this list).
  • Developed and modified procedures, functions, triggers, forms, and reports, and deployed the changes in the system.
  • Developed user interfaces using Oracle Forms.
  • Enhanced query performance through optimization techniques such as index creation, table partitioning, and stored procedures.
  • Responsible for creation of new users and setting up privileges and policies for protection of data.
  • Created reusable transformations, mapplets, and shortcuts for Sources and targets to use across various Mappings.
  • Created Shared folders, users and managed the privileges to users using Informatica Repository Manager.
  • Created shell scripts using Informatica PMCMD command.
  • Configured the sessions using workflow manager to have multiple partitions on Source data to improve the performance.
  • Created persistent lookup caches, modified the lookup SQL override, tuned transformations such as Aggregator and Joiner, and implemented incremental aggregation for better performance.
  • Performed unit testing and User Acceptance Testing to check that data extracted from different source systems and loaded into the targets was accurate per user requirements.
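
A minimal sketch of the kind of package noted above for wrapping business logic in procedures and functions; ORDER_LOAD_PKG, the ORDERS and ORDER_FACT tables, and all column names are assumed for illustration.

    CREATE OR REPLACE PACKAGE order_load_pkg AS
      FUNCTION order_status (p_order_id IN NUMBER) RETURN VARCHAR2;
      PROCEDURE apply_status (p_order_id IN NUMBER);
    END order_load_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY order_load_pkg AS
      -- Look up the current status of an order, defaulting when none exists.
      FUNCTION order_status (p_order_id IN NUMBER) RETURN VARCHAR2 IS
        v_status orders.status%TYPE;
      BEGIN
        SELECT status INTO v_status FROM orders WHERE order_id = p_order_id;
        RETURN v_status;
      EXCEPTION
        WHEN NO_DATA_FOUND THEN RETURN 'UNKNOWN';
      END order_status;

      -- Apply the status to the warehouse fact row for the same order.
      PROCEDURE apply_status (p_order_id IN NUMBER) IS
      BEGIN
        UPDATE order_fact
           SET load_status = order_status(p_order_id)
         WHERE order_id = p_order_id;
      END apply_status;
    END order_load_pkg;
    /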

Environment: Informatica Power Center 8.6.1, Business Objects XI, Oracle 11g, XML, TOAD, ETL, DWH, Teradata, Java, UNIX Scheduler, Citrix Web Interface, MS SQL Server 2008, TSQL, SSIS.

Confidential, PA

Sr. Informatica Developer/Analyst

Responsibilities:

  • Interacted with Business Users and Managers in gathering business requirements.
  • Worked on Dimension modeling as per business needs.
  • Configured and managed Informatica Repositories using Informatica administrator console.
  • Used Informatica Power Center Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer to develop mappings and design workflows.
  • Created mappings using Informatica Designer and designed workflows using Workflow Manager to build DW as per business rules.
  • Transformations used included Source Qualifier, Aggregator, Lookup, Router, Filter, Sequence Generator, Expression, Joiner, and Update Strategy.
  • Developing and modifying changes in mappings according to the business logic.
  • Creating Mapping variables, Mapping Parameters, Session parameters.
  • Coding and debugging, resolving technical problems as they arose.
  • Analyze the types of business tools and transformations to be applied.
  • Generated XML files to deliver to Thomson Reuters.
  • Used FastLoad and MultiLoad connections to load data into Teradata tables.
  • Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, and the target-based commit interval.
  • Implemented performance tuning logic on sources, mappings, sessions, and targets in order to provide maximum efficiency and performance.
  • Involved in writing the unit test cases using SQL (see the reconciliation query sketch after this list).
  • Involved in the creation of various change control forms to promote the code from Dev to QA and on to Production.
  • Created various Autosys entries for scheduling various data cleansing scripts and loading.
  • Fixing the Bugs in the Mappings, Sessions and Parameter files.
  • Loaded data from .CSV files into Oracle and Teradata.
  • Developed the UNIX shell scripts to send out an E-mail on success of the process indicating the destination folder where the files are available.
  • Performed end-to-end unit testing of all the mappings as well as UAT.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
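
A minimal sketch of a SQL reconciliation check used as a unit test case of the kind referenced above; STG_ORDERS and DW_ORDERS are hypothetical staging and warehouse tables.

    -- Expect zero rows back when the load is complete and row counts reconcile.
    SELECT 'ROW_COUNT_MISMATCH' AS test_name,
           src.cnt AS source_rows,
           tgt.cnt AS target_rows
      FROM (SELECT COUNT(*) AS cnt FROM stg_orders) src,
           (SELECT COUNT(*) AS cnt FROM dw_orders)  tgt
     WHERE src.cnt <> tgt.cnt;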

Environment: Informatica Power Center 8.1, Informatica Designer, Repository Manager, Workflow Manager, Power Exchange 8.1, Teradata 12, SQL Server 2005, Oracle 10g, TOAD, PL/SQL, and Windows XP.

Confidential, WI

Sr. ETL developer

Responsibilities:

  • Imported data from various sources (Oracle, fixed-width flat files), transformed it, and loaded it into targets.
  • Using Informatica designer, developed mappings which populated the data into the target.
  • Worked with business analysts for business requirements, analysis and technical specifications.
  • Worked extensively in TOAD to analyze data and fix errors.
  • Performed dimensional data modeling after determining the business rules.
  • Designed and developed complex Informatica mappings, including all slowly changing dimensions.
  • Worked on Informatica Power Center tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and reusable transformations.
  • Used Informatica Power Center to develop the mappings and load data into the target.
  • Extensively used transformations like Router, Filter, Joiner, Source qualifier, look up both connected and unconnected, Expression, Aggregator, Update strategy and sequence generator.
  • Used parameters and variables at the mapping and session levels to improve the performance of the mappings.
  • Involved in fixing invalid mappings and testing the PL/SQL stored procedures and the target data (see the test block sketch after this list).
  • Developed and maintained UNIX shell scripts for database processes and conversions.
  • Worked closely with the Administrator for proper backup and recovery plans.
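
A minimal sketch of how a PL/SQL stored procedure and its target data can be checked in a unit test, as referenced above; LOAD_PKG.LOAD_CUSTOMER and the TGT_CUSTOMER table are assumed names.

    DECLARE
      v_before NUMBER;
      v_after  NUMBER;
    BEGIN
      SELECT COUNT(*) INTO v_before FROM tgt_customer;
      load_pkg.load_customer;   -- procedure under test (hypothetical name)
      SELECT COUNT(*) INTO v_after FROM tgt_customer;
      IF v_after <= v_before THEN
        RAISE_APPLICATION_ERROR(-20001, 'load_customer added no rows');
      END IF;
    END;
    /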

Environment: Informatica 9.1, Oracle 11g, SQL, PL/SQL, TOAD, UNIX, and Windows XP.

Confidential, Seattle, WA

ETL Consultant

Responsibilities:

  • Extracted data from an Oracle database, transformed it, and loaded it into a Teradata database according to the business specifications.
  • Implemented logical and physical data modeling with Star and Snowflake schema techniques using ERwin in the data mart.
  • Created Informatica mappings to load the data files into staging tables and then into the Teradata warehouse system.
  • Used PL/SQL procedures from Informatica mappings to truncate the data in target tables at run time (see the truncate procedure sketch after this list).
  • Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs and extracted data from Transaction System into Staging Area.
  • Used Workflow Manager for creating, validating, testing, running the sequential, concurrent Batches and Sessions.
  • Implemented slowly changing dimensions methodology to keep track of historical data.
  • Extensively wrote BTEQ scripts to incorporate the transformation rules.
  • Developed Teradata FastLoad scripts for initial loads and MultiLoad scripts for data loads.
  • Implemented modifications to improve performance of long running batch jobs with the help of changes in SQL statements, indexes and Explain Plans.
  • Involved in writing UNIX shell scripts for ETL tools to run the sessions.
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Involved in the code reviews prepared by the teams.
  • Created test cases for unit testing, system integration testing, and UAT to check data quality.
  • Scheduled and monitored automated weekly jobs.
  • Optimized query performance, session performance and reliability.
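
A minimal sketch of a run-time truncate procedure of the kind referenced above, callable from an Informatica pre-session stored procedure; the procedure name is hypothetical.

    CREATE OR REPLACE PROCEDURE truncate_target (p_table_name IN VARCHAR2) IS
    BEGIN
      -- DBMS_ASSERT guards the dynamic statement against an invalid table name.
      EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
    END truncate_target;
    /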

Environment: Teradata V2R12, Teradata SQL Assistant, Teradata Manager, Erwin Designer, Informatica Power Center 8.2, UNIX, Autosys, Control M

Confidential

ETL Developer

Responsibilities:

  • Regularly performed backup and restore activities on development and test databases.
  • Implemented Oracle Data Guard in the client environment to minimize data loss.
  • Collected user requirements by participating in meetings and studying the existing system.
  • Created database objects like Tables, Views, Sequences, Directories, Synonyms, Stored Procedures, Function and Packages, Cursor, Ref Cursor and Triggers.
  • Involved in analysis, design, coding and testing.
  • Involved in modifying various existing Packages, Procedures, Functions and Triggers according to new business needs.
  • Developed complex SQL queries using joins, subqueries, and correlated subqueries to retrieve data from the database (see the query sketch after this list).
  • Used SQL*Loader to upload the information into the database.
  • Used Explain Plan for query optimization.
  • Wrote test cases and performed unit testing.
  • Involved in writing Technical Specification Documentation.
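
A minimal sketch of a correlated subquery of the kind described above; the EMPLOYEES table and its columns are illustrative only.

    -- Employees earning above the average salary of their own department.
    SELECT e.employee_id, e.department_id, e.salary
      FROM employees e
     WHERE e.salary > (SELECT AVG(i.salary)
                         FROM employees i
                        WHERE i.department_id = e.department_id);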

Environment: Oracle 8i/9i, SQL Developer, TOAD, SQL Server, DB2, UNIX, Control M and Windows XP
